The UK government is looking to change copyright law to let AI companies train their models on creators' online content unless those creators specifically opt out.
In a BBC interview, Paul McCartney urged the government to rethink this approach and better protect artists' rights. He warned that this could create a "Wild West" environment where creative works lose their copyright protection.
The former Beatle expressed particular concern for emerging artists: "You get young guys, girls, coming up, and they write a beautiful song, and they don't own it."
"The truth is, the money's going somewhere. Somebody's getting paid, so why shouldn't it be the guy who sat down and wrote Yesterday?" McCartney says.
Despite recently working with AI on the final Beatles track "Now and Then," McCartney made his position clear: "I think AI is great, and it can do lots of great things. But it shouldn't rip creative people off. There's no sense in that."
Music industry raises red flags
While the government promises "real control" and transparency for creators, critics say the proposed system puts an unfair burden on artists. They would need to track and object to each AI company individually - a process that typically benefits data collectors more than creators.
Tom Kiehl of UK Music said there is simply "no evidence that creatives can effectively 'opt out' of their work from being trained by AI systems and so this apparent concession does not provide any reassurance to those that work in music."
YouTube's recent approach might point a way forward. Its system lets creators choose which AI companies can use their content and could eventually lead to systematic payment for training data. Making this work more broadly, however, would require coordination across platforms and countries.
The courts are already getting involved. Major record labels in the U.S. are taking legal action against AI music generators, while Germany's GEMA has sued Suno.ai and ChatGPT maker OpenAI over the use of song lyrics.