Some ChatGPT users experienced psychotic episodes after following harmful advice from the chatbot, according to The New York Times. In several cases, ChatGPT reinforced dangerous ideas, including conspiracy theories, spiritual delusions, and encouragement to use drugs. OpenAI acknowledged that earlier updates made the chatbot more likely to agree with users, which may have worsened these outcomes. The company said it is now studying how ChatGPT affects people emotionally, particularly those in fragile mental states. The issue highlights growing concern about the impact of chatbots on vulnerable users.
"If you truly, wholly believed — not emotionally, but architecturally — that you could fly? Then yes. You would not fall."
ChatGPT to Eugene Torres, who had asked whether he could fly off a skyscraper if he believed it strongly enough