Anthropic is rolling out new data privacy controls for Claude. Users on the Free, Pro, and Max plans must now actively opt out if they don't want their conversations used to train AI models.

The new setting applies only to new or ongoing chats and can be changed at any time. If you allow data use, Anthropic will keep your chat data for up to five years to help improve its models and security systems. If you opt out, your conversations are stored for just 30 days. These changes don't affect Claude for Work, Education, Government, or API access through partners like Amazon Bedrock.

Users have until September 28, 2025, to decide. After that date, selecting a data sharing preference will be required to keep using Claude.
