Meta CTO Andrew Bosworth does not believe in watermarking systems for AI-generated media. At the same time, he believes that people will get used to deepfakes and AI-generated content.
In an interview with The Verge, Bosworth expresses skepticism about watermarking as a solution to the problem of distinguishing AI-generated or manipulated content from authentic media.
According to Bosworth, there is very little in the digital realm that cannot be reproduced digitally. Watermarks meant to prove that content is authentic, or that it is AI-generated, could be subverted in either direction, creating new risks.
Still, Bosworth makes clear that Meta wants to take responsibility for its role on the Internet; the company's recently announced image generator, for example, uses watermarking technology.
OpenAI CEO Sam Altman is also skeptical of watermarking as a way to authenticate content as either human-made or AI-generated. OpenAI recently took its AI text detector offline due to poor accuracy.
A brief period of authentic multimedia
According to Bosworth, society has lived through a unique period in history in which photos and videos could almost always be assumed to be real, a period spanning roughly the last 50 years. Before that, written and oral accounts of events were treated as suspect. That skepticism will return in the future, and it will extend to visual information as well.
Bosworth believes that society has faced and overcome similar challenges in the past. People will get used to fake media and adapt to the new reality.
"From today forward, it seems very likely that the youth of our world will know that all accounts that they come across, whether it be textual, oral, or video or photographic are suspect," says Bosworth.
The last 50 years or so would thus be the historical outlier, a period in which it was, unusually, cheaper to capture real images than to generate artificial ones.
Ian Goodfellow, the inventor of the GAN technology that powered the first deepfakes, made a nearly identical observation in 2017. The last few decades have been "a little bit of a fluke, historically," in terms of the authenticity of audiovisual information, Goodfellow said. In the future, people will need to be more skeptical.