
Claude users must now opt out to keep their chats out of AI training

Anthropic is rolling out new data privacy controls for Claude. Users on the Free, Pro, and Max plans must now actively opt out if they don't want their conversations used to train AI models.

The new setting only applies to new or ongoing chats and can be changed at any time. If you allow data use, Anthropic will keep your chat data for up to five years to help improve its models and security systems. If you opt out, your conversations are stored for just 30 days. These changes don't affect Claude for Work, Education, Government, or API access through partners like Amazon Bedrock.

Users have until September 28, 2025, to make their choice. After that date, selecting a data sharing preference becomes mandatory to continue using Claude.



Source: Anthropic