
Claude users must now opt out to keep their chats out of AI training

Anthropic is rolling out new data privacy controls for Claude. Users on the Free, Pro, and Max plans must now actively opt out if they don't want their conversations used to train AI models.

The new setting only applies to new or ongoing chats and can be changed at any time. If you allow data use, Anthropic will keep your chat data for up to five years to help improve its models and security systems. If you opt out, your conversations are stored for just 30 days. These changes don't affect Claude for Work, Education, Government, or API access through partners like Amazon Bedrock.

Users have until September 28, 2025, to make their choice; after that, selecting a data sharing preference becomes mandatory to keep using Claude.
