In early 2021, ordinary retail investors mounted an assault against Wall Street hedge funds. Mobilizing on Reddit and relying on user-friendly trading apps like Robinhood, amateur investors sparked a short squeeze in the stock of video game retailer GameStop (ticker: GME).
The regulatory scrutiny that followed culminated last July, when the SEC proposed new rules targeting conflicts of interest in financial firms’ use of “predictive data analytics” (PDA). PDA refers to technologies that analyze financial data and make predictions about market movements and investor behavior. These include robo-advisors, stock analysis algorithms, and other AI-driven tools that broker-dealers and investment advisers use to inform their investing strategies and recommendations to clients.
Absent such new regulations, SEC Chair Gary Gensler says, it is “nearly unavoidable” that AI will cause a financial crisis in the next decade, according to a recent interview in the Financial Times. But will technology inevitably lead to booms and busts in stock prices? Don’t be so sure.
The SEC’s proposal invokes behavioral psychology to make its case. The agency argues that technology can induce excessive trading by exploiting investor biases and subtly “nudging” investors into herd-like behavior. But the SEC presents no real evidence that these problems are widespread, which would require knowing not only how much trading is “excessive” but also which investments are made “rationally” and which are not.
Ironically, GameStop’s stock surge in 2021 was far from a passive reaction to technology. Investors deliberately coordinated, using technology as a means to stick it to short sellers, most notably hedge funds betting against a beloved retailer.
Adding to the irony, the SEC’s regulatory response is itself reactionary, seemingly driven by political pressure rather than by a measured assessment of a proven, systemic problem. Rather than accuse ordinary investors of irrationality, regulators might want to look in the mirror and consider how behavioral biases shaped this rulemaking: social desirability bias, the tendency to express support for proposals viewed as popular, and availability bias, the tendency to overweight recent, high-profile events.
The SEC’s proposed solution is to mandate that firms write policies, follow procedures, and keep written records pertaining to PDA conflicts, and to eliminate any conflicts found. Initial compliance costs exceed $400 million according to the SEC’s own estimates, with ongoing costs thereafter. But the anecdotal nature of the GameStop saga suggests the benefits of this regulation will be sporadic at best. Indeed, the SEC didn’t even bother to calculate any benefits in its analysis for the rule, perhaps because there aren’t any to report.
As my colleague John Berlau and I noted in a comment letter to the SEC, “the plural of anecdote is not data.” Perhaps after gathering reliable evidence demonstrating the need for new regulation, an SEC proposal would be warranted. But any new AI rules should stem from comprehensive evidence collection, examination of alternatives, and cost-benefit analysis. Absent such due diligence, the SEC risks cracking down on beneficial new technologies that, if anything, democratize trading beyond the confines of the Wall Street elite.
The public deserves policies founded on rigorous, unbiased analysis—not speculative ivory tower theorizing about irrational investing. No one makes perfect decisions. The SEC’s proposal shows that it too is not immune from imperfect decision making.