Anthropic is rolling out new data privacy controls for Claude. Users on the Free, Pro, and Max plans must now actively opt out if they don't want their conversations used to train AI models.
The new setting applies only to new or ongoing chats and can be changed at any time. If you allow data use, Anthropic will keep your chat data for up to five years to help improve its models and security systems. If you opt out, your conversations are stored for just 30 days. These changes don't affect Claude for Work, Education, Government, or API access through partners like Amazon Bedrock.
Users have until September 28, 2025, to decide. After that date, you'll have to select a data-sharing preference to keep using Claude.