Artificial intelligence will reshape the world of finance over the next decade or so by automating investing and other services—but it could also introduce troubling systemic weaknesses and risks, according to a new report from the World Economic Forum (WEF).
Compiled from interviews with dozens of leading financial experts and industry leaders, the report concludes that artificial intelligence will disrupt the industry by allowing early adopters to outmaneuver competitors. It also suggests that the technology will create more convenient products for consumers, such as sophisticated tools for managing personal finances and investments.
But most notably, the report points to the potential for big financial institutions to build machine-learning-based services that live in the cloud and are accessed by other institutions.
“The dynamics of machine learning create a strong incentive to network the back office,” says the report’s main author, Jesse McWaters, who leads the AI in Financial Services Project at the World Economic Forum. “A more networked world is more vulnerable to cybersecurity risks, and it also creates concentration risks.”
In other words, financial systems that incorporate machine learning and are accessed through the cloud by many different institutions could present a juicy target for hackers and a single point of systemic failure.
Wall Street is already rapidly adopting machine learning, the technology at the center of the artificial-intelligence boom. Finance firms generally have lots of data and plenty of incentive to innovate. Hedge funds and banks are hiring AI researchers as quickly as they can, and the financial industry is experimenting with back-office automation in a big way. The automation of high-frequency trading has already created systemic risks, as highlighted by several runaway trading events, or “flash crashes,” in recent years.
Andrew Lo, a professor at MIT’s Sloan School of Management, researches the issue of systemic risk in the financial system, and he has previously warned that the system as a whole may be vulnerable because of its sheer complexity.