Microsoft takes the heat for its users in AI-related copyright infringement cases
Key Points
- Numerous copyright lawsuits have been filed against Google, Microsoft, and OpenAI over their generative AI systems. Microsoft wants to protect its customers in case of defeat.
- Microsoft President Brad Smith offers a legal safety guarantee: Customers who use Microsoft's Copilot products as intended will be backed by Microsoft if they face copyright claims.
- Microsoft's safety guarantee applies to GitHub Copilot, Bing Chat Enterprise, and Microsoft 365 Copilot. Microsoft uses filters in its products to protect copyrights.
Generative AI systems from Google, Microsoft, OpenAI, and others are the subject of numerous copyright lawsuits. Microsoft wants to protect its customers in the event of legal issues.
In a blog post, Microsoft President Brad Smith offers a safety guarantee: Customers of Microsoft's Copilot products can use the output as they see fit. If they are "challenged on copyright grounds," Microsoft will take responsibility for the legal consequences.
"If a third party sues a commercial customer for copyright infringement for using Microsoft’s Copilots or the output they generate, we will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, as long as the customer used the guardrails and content filters we have built into our products," Smith writes.
In addition to GitHub Copilot, the commitment covers Microsoft's commercial Copilot services: Bing Chat Enterprise and Microsoft 365 Copilot, which brings generative AI to Word, Excel, PowerPoint, and more.
Microsoft trusts its filters
According to Smith, Microsoft is responsible if a customer uses a Microsoft AI product as intended and gets into legal trouble as a result. He likens the new AI commitment to Microsoft's existing practice of defending customers who are sued for patent infringement related to its products.
At the same time, Smith acknowledges the needs of authors. Generative AI raises new questions about copyright, he said. It could bring progress to humanity, but authors must retain control over their rights and be able to make a living from their work. Training data should not end up in the hands of a few companies that could use it to stifle competition.
Copilot systems include guardrails and content filters to protect authors' copyrights, Smith said. Classifiers, metaprompts, content filters, and operational monitoring and abuse detection are designed to reduce the likelihood that a Copilot generates copyright-infringing content.
"AI raises legal questions that our industry will need to work through with a wide array of stakeholders. This step represents a pledge to our customers that the copyright liability of our products is ours to shoulder, not theirs," Smith writes.