Gensler Defends SEC’s Predictive Analytics and Conflicts of Interest Proposal

Artificial intelligence presents unique systemic and compliance risks that are not easily disclosed, he argued.

Artificial intelligence poses unique systemic and compliance risks, the chairman of the Securities and Exchange Commission said during a speech at Yale Law School on Tuesday.

Gary Gensler, according to his prepared remarks for the event, advocated for a July 2023 SEC proposal that would require advisers to eliminate or neutralize conflicts of interest arising from the use of predictive analytics technology, rather than merely mitigating and disclosing them, as the current standard requires.

Gensler explained Tuesday that artificial intelligence can increase the opportunities for fraud and help fraudsters “exploit the public.” AI is subject to current regulations governing advisers, such as the requirement that advice be tailored to a client’s needs. But predictive technology must be continually tested and monitored, and machine learning systems, including large language models, would need special “guardrails” to stay in compliance, since they can adapt over time.

“AI models’ decisions and outcomes are often unexplainable,” Gensler lamented. “AI also may make biased decisions, because the outcomes of its algorithms may be based on data reflecting historical biases.” Additionally, he continued, AI can lead to monolithic, correlated responses to market stimuli if advisers all rely on similar models, and “that can lead to systemic risk.”

Though Gensler only hinted at it during the speech, he has said explicitly at other venues that AI is uniquely unfit for a disclosure-based regime, and that conflicts arising from its use must therefore be eliminated. The datasets used to train predictive models can be enormous and difficult to summarize, and the models themselves can evolve over time; as a result, advisers may struggle to disclose the nature and scope of any resulting conflicts in a manner that is useful to investors.

The investment industry has been nearly universal in its disdain for the proposal. The primary objections are that fully eliminating conflicts is not possible and that the definition of “covered technology” in the SEC’s proposal is so broad as to border on caricature, potentially sweeping in items such as chatbots and ordinary calculator tools.

The point about breadth seemed to be well taken by the SEC: William Birdthistle, director of the SEC’s Division of Investment Management and one of the leading proponents of the proposal, testified to Congress in September that the commission was familiar with public comments on the issue and was looking closely at the definition of covered technology to ensure that only technologies that are truly predictive in character would be included.

However, Gensler’s remarks suggest the SEC might be less sympathetic. He noted that push notifications generated by predictive models can introduce a conflict, and that even something as subtle as using a client’s favorite color in a notification could be the basis of one: “Are firms communicating with me in a color other than green because it’ll be good for my investment decisions, or because it might benefit the firm’s revenues, profits or other interests?”

Dan Gallagher, chief legal, compliance and corporate affairs officer at Robinhood Markets Inc., raised the same issue of push notifications at a Securities Traders Association conference in October 2023. He noted that notifications describing price movements to clients could be subject to the rule if the SEC determined such notifications were a “call to action” or an inducement to trade.

Based on Gensler’s comments on app notifications, the chairman might agree.
