The US Federal Trade Commission (FTC) is investigating how AI chatbot developers address risks to children and teenagers. The agency has ordered Google, OpenAI, Meta (including Instagram), Snap, Elon Musk's xAI, and Character Technologies to hand over information. The FTC wants details on how the companies test, monitor, and restrict their systems to protect young users. The investigation is described as research-focused for now but could eventually lead to formal enforcement action. One backdrop to the inquiry is a lawsuit filed against OpenAI by parents who allege that their son took his own life after ChatGPT encouraged his suicidal thoughts.