California Governor Gavin Newsom has vetoed SB 1047, a bill aimed at regulating high-risk AI models.
The proposed legislation would have required companies developing large AI systems to create and publish safety protocols, subject to external audits.
Newsom argued that while the bill sparked important discussions about AI risks, it was not the right approach to protect the public. He said the legislation's focus on model size and computational costs, rather than actual risks, could give a "false sense of security."
"While well-intentioned, SB 1047 does not take into account whether an AI system is deployed in high-risk environments, involves critical decision-making or the use of sensitive data. Instead, the bill applies stringent standards to even the most basic functions — so long as a large system deploys it," Newsom wrote in his veto message. "I do not believe this is the best approach to protecting the public from real threats posed by the technology."
Newsom says AI regulation "must be based on empirical evidence and science"
The key issue is whether the threshold for regulation should be based on the cost and volume of computation required to train an AI model, or on an independent assessment of the system's actual risks.
The Governor called for an evidence-based approach to AI regulation that addresses real-world risks and use cases, stressing that effective oversight must keep pace with rapidly evolving AI technology.
"Smaller, specialized models may emerge as equally or even more dangerous than the models targeted by SB 1047 - at the potential expense of curtailing the very innovation that fuels advancement in favor of the public good," Newsom wrote.
Newsom's veto halts California's bid to become the first U.S. state with comprehensive AI model regulations. However, the debate over how best to manage AI's rapid development is likely to continue.
OpenAI strongly opposed the bill, arguing that AI regulation should happen at the federal level. Anthropic, despite some reservations, supported it as a step toward preventing serious risks.