New software developed at the University of Waterloo aims to give regulators and financial firms insight into the predictions made by self-teaching algorithms that are otherwise something of a mystery from the outside, the Waterloo, Ont.-based institution announced on Friday.
The software is designed to analyze decisions made by so-called deep-learning artificial intelligence (AI) algorithms, which can be viewed as impenetrable “black boxes” by outside observers.
The goal of the project is to create an interface that "make(s) it easier to adopt and trust powerful artificial intelligence (AI) systems that generate stock market predictions, assess who qualifies for mortgages and set insurance premiums," the university says in its announcement.
The hope is that the software will give regulators and analysts confidence in algorithm-generated recommendations by offering insight into the reasoning behind each prediction, which can then be weighed against real-world experience to judge whether the prediction makes sense.
Currently, these algorithms "essentially teach themselves by processing and detecting patterns in vast quantities of data. As a result, even their creators don’t know why they come to their conclusions," the university says.
“If you’re investing millions of dollars, you can’t just blindly trust a machine when it says a stock will go up or down,” says Devinder Kumar, lead researcher and a PhD candidate in systems design engineering at Waterloo, in a statement. “This will allow financial institutions to use the most powerful, state-of-the-art methods to make decisions.”
Researchers at Waterloo first created an algorithm to predict movements on the S&P 500 stock index. They then developed software, called CLEAR-Trade, to examine the predictions and produce graphs and charts that highlight the factors that the AI system relied on most heavily. The researchers expect to start field trials of the software within a year.
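CLEAR-Trade's internal workings are not described in the announcement, so the following is only an illustrative sketch of the general idea: scoring which inputs a model relied on most by occluding them one at a time and measuring how much the prediction shifts. The stand-in `model` (a fixed weighted sum over hypothetical recent daily returns) and all names here are invented for illustration; a real system would attribute over a trained deep network.

```python
# Illustrative sketch only -- the stand-in model and data are hypothetical,
# not CLEAR-Trade's actual method.

def model(features):
    # Stand-in "predictor": a fixed weighted sum of recent daily returns.
    # A real deep-learning system would replace this with a trained network.
    weights = [0.1, 0.3, 0.05, 0.5, 0.05]
    return sum(w * f for w, f in zip(weights, features))

def occlusion_attribution(predict, features, baseline=0.0):
    """Score each input by how much the prediction changes when that
    input is replaced with a neutral baseline value."""
    base_pred = predict(features)
    scores = []
    for i in range(len(features)):
        occluded = list(features)
        occluded[i] = baseline
        scores.append(abs(base_pred - predict(occluded)))
    return scores

# Hypothetical recent daily returns fed to the model.
returns = [0.01, -0.02, 0.005, 0.03, -0.01]
scores = occlusion_attribution(model, returns)
for day, score in enumerate(scores, 1):
    print(f"day {day}: attribution {score:.4f}")
```

Attribution scores like these are what a tool could render as the bar charts and highlights described above, letting an analyst see at a glance which inputs drove a given prediction.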
“The potential impact, especially in regulatory settings, is massive,” adds Kumar. “If you can’t provide reasons for their decisions, you can’t use those state-of-the-art systems right now.”