SEC Warns Artificial Intelligence Could Trigger “Almost Inevitable” Financial Crisis
Unemployment, the demise of human creativity, plagiarism, even the extinction of humanity: what else do we need to worry about when it comes to advanced artificial intelligence? Add a financial crisis, which is "almost inevitable," according to the chairman of the U.S. Securities and Exchange Commission.
Gary Gensler, chairman of the U.S. Securities and Exchange Commission (SEC), told the UK's Financial Times that the increasingly widespread use of artificial intelligence systems is almost certain to trigger a financial crisis within the next decade.
Gensler warned that such a crisis is almost inevitable because of the financial sector's reliance on artificial intelligence models developed by tech companies. He also criticized the lack of diversity among the AI tools financial institutions currently use for monitoring markets, providing advice, and opening accounts.
Gensler suggested that the solution lies in introducing regulations covering both generative AI models and how Wall Street firms use them, as those firms have been adopting the technology en masse since the start of this year. However, the SEC chief admitted that this will be a "cross-regulatory challenge."
“Frankly, it’s a tough challenge,” Gensler told the Financial Times. “It’s a difficult financial stability issue because much of our regulation is aimed at individual institutions, individual banks, individual money market funds, individual brokers; that’s the nature of what we do. It’s a horizontal [issue]; many institutions may rely on the same underlying base models or underlying data aggregators.”
The scenario Gensler describes would not be the first time technology has destabilized financial markets. In 2010, a British trader working from his parents' basement in London flooded the Chicago Mercantile Exchange with fake orders, illegally manipulating the market and helping trigger a "flash crash" that briefly wiped out nearly $1 trillion in U.S. stock market value before prices rebounded almost immediately. Regulators concluded that high-frequency trading algorithms were among the causes of the crash.
So far, AI companies have largely agreed to self-regulate and manage the risks their technology creates, but governments are pushing for stronger oversight. The EU is drafting an Artificial Intelligence Act that could require developers of generative AI tools to submit them for review before full release. Meanwhile, the U.S. government is still studying the technology to determine which aspects need regulation.
In July, the SEC proposed new rules requiring brokers and investment advisers to take certain steps to address conflicts of interest arising from the use of predictive analytics in their interactions with investors. The aim is to prevent firms from putting their own interests ahead of investors'.