The evolution of artificial intelligence (AI) in the financial services sector represents a double-edged sword for clients: the technology could be used to nudge them toward smart financial decisions or to exploit their ignorance, according to experts who participated in a panel discussion at the Ontario Securities Commission’s (OSC) annual policy conference in Toronto on Thursday.
The panellists who participated in the “AI: A Game Changer for Financial Services?” discussion during OSC Dialogue 2017 focused on the emergence of AI as a potential force for both good and evil within the financial services sector. Specifically, in its initial stages, AI was largely used to help companies detect possible fraud among clients, but the technology’s capabilities are rapidly changing and becoming more powerful.
Notably, during the past 18 months, AI technology has evolved from a relatively primitive stage, at which it had to be trained to analyze a set of data, to being able to train itself to analyze data in real time, according to Erin Kelly, president and CEO of Ottawa-based Advanced Symbolics Inc. Such a development could allow regulators to identify improper short-selling behaviour almost immediately, for example. At the same time, this self-training capability avoids the problem of injecting bias into the AI and helps guard against novel bad behaviour.
Yet, as the technology grows more powerful, it also has greater potential to be used to exploit consumers. Doug Steiner, a financial services sector veteran who now heads a Toronto-based startup, Evree Corp., points out that one of the first uses of AI in financial markets was to allow professional traders to trade against uninformed retail investors — identifying clients who routinely lost money trading and automatically taking the other side of their trades.
Now, financial services firms are using client data to distinguish their financially savvy clients from their less sophisticated ones, and then selling high-margin products to the clients who don’t know any better, he says. For example, Steiner notes that five million Canadians pay high rates of monthly credit card interest, and yet 20% of them have savings at the same bank.
“There is no bank in this country that will use artificial intelligence, or any intelligence, to tell you pay off that card with your savings,” he says.
Indeed, Steiner notes that, currently, the data on consumer financial behaviour is largely in the hands of banks that have no incentive to help people become smarter with their money. His new company is trying to take the behavioural data that financial services institutions collect on their clients, and currently use to sell them products, and give that data back to clients in ways that will enable them to make smarter financial decisions.
“We have to teach people how to become investors, and the only way to do that is to get people to spend less and save more money,” he says.
Kelly points out that data revealing consumers’ financial habits and intentions is now being collected from their online activity as well, and that people need to understand how companies use their information.
“If you’re going to be posting things on social media, know that AIs are going to be reading it,” she says. “We have to educate our kids on how to be personally responsible, and not to be expecting that other people are going to be looking out for your best interest, because no one is going to care about your best interest more than you will.”
Although Kelly stresses the importance of self-education and personal responsibility, another panellist, Janet Bannister, general partner with Montreal-based venture capital firm Real Ventures, expresses some hope that companies will increasingly use their consumer data and AI to help their clients rather than exploit them, if only because doing so is in the firms’ own self-interest.
Some firms are starting to recognize that their long-term survival likely depends on their willingness to look out for their customers’ best interests, she notes, even if that means sacrificing short-term revenue.
Jake Tyler, co-founder and CEO of Vancouver’s Finn.ai, echoes that sentiment, adding that financial services firms that aren’t putting clients first will ultimately find themselves at risk.
“If you’re a bank, and you have someone else acting as that ‘trusted personal financial advisor,’ the bank is in big trouble there, because the bank is now just a utility service,” he says. “So, if you’re a bank, I would be pushing as much as possible to be that trusted advisor.”
Financial services firms have to ensure that clients can trust them to serve their best interests, Tyler says; otherwise, “there’s fundamentally a huge amount of value at risk there.”