
Claude users must now opt out to keep their chats out of AI training

Anthropic is rolling out new data privacy controls for Claude. Users on the Free, Pro, and Max plans must now actively opt out if they don't want their conversations used to train AI models.

The new setting applies only to new or ongoing chats and can be changed at any time. If you allow data use, Anthropic will retain your chat data for up to five years to help improve its models and security systems. If you opt out, your conversations are stored for just 30 days. These changes don't affect Claude for Work, Education, Government, or API access through partners like Amazon Bedrock.

Users have until September 28, 2025, to make their choice. After that, you'll have to select a data sharing preference to keep using Claude.
