Microsoft has released guidance to help system administrators keep its Copilot AI from accessing and sharing more information than it should. The problem arises when Copilot can read data beyond a user's intended access level, which can lead to unwanted data exposure and responses containing information the user shouldn't see. The guidance suggests admins first identify lower-risk SharePoint sites so they can test the AI's behavior in a more controlled environment, then remove sensitive content from Copilot's reach, and finally tighten privacy by restricting page access to specific team members.
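As a rough illustration of the first step (taking stock of which SharePoint sites Copilot could draw content from), the sketch below uses the Microsoft Graph API to list the sites in a tenant so an admin can flag candidates for tighter access controls. It is not part of Microsoft's guidance; it assumes an Entra ID app registration with the Sites.Read.All application permission, and the tenant ID, client ID, and client secret shown are placeholders.

```python
# Minimal sketch: enumerate SharePoint sites via Microsoft Graph so an admin
# can review which sites Copilot could surface content from.
# Assumes an Entra ID (Azure AD) app registration with the Sites.Read.All
# application permission; TENANT_ID, CLIENT_ID, and CLIENT_SECRET are placeholders.
import requests
import msal

TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"


def get_token() -> str:
    """Acquire an app-only access token for Microsoft Graph."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    if "access_token" not in result:
        raise RuntimeError(f"Token request failed: {result.get('error_description')}")
    return result["access_token"]


def list_sites(token: str):
    """Yield all SharePoint sites in the tenant, following paging links."""
    url = "https://graph.microsoft.com/v1.0/sites?search=*"
    headers = {"Authorization": f"Bearer {token}"}
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")  # None when there are no more pages


if __name__ == "__main__":
    token = get_token()
    for site in list_sites(token):
        # Print each site's name and URL so an admin can decide which ones
        # need restricted access or removal from Copilot's reach.
        print(f"{site.get('displayName', '(no name)')}: {site.get('webUrl')}")
```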
