AI and society

Stability AI Audio Team Leader resigns due to disagreement over fair use

Maximilian Schreiner

Ed Newton-Rex, head of the audio team at Stability AI, has resigned. He cited a disagreement over the application of 'fair use' when training generative AI models.

In a public statement, Newton-Rex expressed his concerns: "I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use’."

He acknowledged the thoughtful approach of many at Stability AI and praised the company for its support in developing Stable Audio, an AI music generator based on licensed training data that shares revenue with rights holders. However, he said that this had not changed the prevailing view within the company about fair use.

US Copyright Office considers fair use in generative AI

The issue of fair use in generative AI came to the forefront when the US Copyright Office recently invited public comments on the subject, and Stability AI was among many companies that responded. In their 23-page submission, Stability AI stated, "We believe that AI development is an acceptable, transformative, and socially-beneficial use of existing content that is protected by fair use."

Fair use is a legal doctrine that allows limited use of copyrighted material without requiring permission from the rights holders. According to Newton-Rex, one factor affecting whether copying is fair use is "the effect of the use upon the potential market for or value of the copyrighted work."

He believes that today's generative AI models can be used to create works that compete directly with the copyrighted works they were trained on, which in his view undermines the argument that training AI models in this manner qualifies as fair use.

AI training without consent is morally wrong

Apart from the fair use argument, Newton-Rex also believes that training generative AI models without permission is morally wrong. He said that companies worth billions of dollars are training AI models on creators' works without permission, potentially undermining the creators' livelihoods.

Despite his disagreement with Stability AI's fair use stance, Newton-Rex remains a supporter of generative AI, having worked in the field for 13 years. However, he emphasizes that he can only support generative AI that does not exploit creators by training models on their work without permission.

Newton-Rex hopes that others within generative AI companies will speak up about the fair use issue and push for a change in how creators are treated in the development of generative AI technology.

High licensing fees could slow generative AI, say AI companies

In addition to Stability AI, AI companies such as Meta, Google and OpenAI have also submitted comments to the US Copyright Office, arguing that training AI models with copyrighted material is fair use and does not infringe the rights of copyright holders. Meta compared generative AI to a printing press, a camera or a computer, and argued that high licensing fees for AI training data could slow the development of generative AI.

Google and OpenAI argued for a flexible interpretation of fair use and warned against premature legislation that could stifle innovation and limit the potential of AI technology.
