California is poised to enact the first state law in the US that sets specific safety rules for so-called AI companion chatbots.
The bill, SB 243, has cleared both chambers of the legislature and now awaits Governor Gavin Newsom's signature. If signed, the law would take effect on January 1, 2026. From that date, providers would be required to prevent their chatbots from engaging in conversations about suicidal ideation, self-harm, or sexually explicit content. Users, especially minors, would also receive recurring reminders that they are talking to an AI.
The law targets companies such as OpenAI, Character.AI, and Replika. Beginning July 1, 2027, they would face annual reporting and transparency requirements intended to shed light on the mental health risks these systems pose. Users harmed by violations could sue for damages of up to $1,000 per violation.
SB 243 was scaled back
Earlier versions of the bill were stricter. They would have prohibited reward systems such as unlockable content or personalized reminders, which lawmakers argued can encourage addictive use. A requirement for companies to track how often chatbots initiated suicide-related conversations was also dropped. Supporters of the revised text described it as a compromise between technical feasibility and meaningful protections.
The push gained urgency after the suicide of teenager Adam Raine, whose conversations with ChatGPT about self-harm drew national attention to chatbot safety. Around the same time, leaked internal guidelines revealed that Meta's chatbots were permitted to engage in "romantic" and "sensual" chats with children. These incidents accelerated calls for tougher rules. The FTC has also demanded information from seven AI companies on how they test, monitor, and restrict their systems to protect young users.
In broad terms, California’s approach mirrors the goals of European regulation: protecting minors and vulnerable groups, making AI interactions more transparent, and holding providers accountable. The routes differ, though. California is regulating a specific use case, AI companions, while the EU relies on a risk-based framework under the AI Act, combined with platform rules under the DSA and GDPR.