Microsoft releases guidance to stop Copilot AI from oversharing sensitive data
Microsoft has released guidance for system administrators to prevent its Copilot AI from accessing and sharing too much information. The problem occurs when Copilot can see data beyond a user's intended access level, which can lead to unwanted data exposure or responses containing information users shouldn't see. The guidance recommends that admins:

- First identify SharePoint sites with lower security risks, so the AI's behavior can be tested in a safer environment.
- Then remove sensitive content from Copilot's reach.
- Finally, strengthen privacy by restricting page access to specific team members.
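The first two steps amount to an inventory-and-triage pass over a tenant's sites. As a minimal sketch of that idea (the site fields, URLs, and the risk rule below are illustrative assumptions, not part of Microsoft's guidance or any SharePoint API):

```python
# Hypothetical triage helper for a Copilot rollout: partition sites into
# low-risk pilot candidates and sites whose access should be restricted
# before Copilot can index them. The risk rule is a placeholder assumption.

def triage_sites(sites):
    """Split sites into (pilot candidates, sites to restrict first)."""
    pilot, restrict = [], []
    for site in sites:
        # Treat a site as low risk only if it holds no sensitive content
        # and is not shared broadly across the organization.
        if not site["has_sensitive_content"] and not site["shared_org_wide"]:
            pilot.append(site["url"])
        else:
            restrict.append(site["url"])
    return pilot, restrict

# Example inventory (hypothetical site URLs and metadata):
sites = [
    {"url": "https://contoso.sharepoint.com/sites/team-wiki",
     "has_sensitive_content": False, "shared_org_wide": False},
    {"url": "https://contoso.sharepoint.com/sites/hr-payroll",
     "has_sensitive_content": True, "shared_org_wide": False},
]

pilot, restrict = triage_sites(sites)
print(pilot)     # ['https://contoso.sharepoint.com/sites/team-wiki']
print(restrict)  # ['https://contoso.sharepoint.com/sites/hr-payroll']
```

In practice, the inventory data would come from an admin's own tenant reporting, and the restriction step would be carried out with SharePoint's access controls rather than in code like this.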