San Francisco City Attorney David Chiu has filed a lawsuit against 16 popular websites that produce and distribute AI-generated pornography without consent, including images of minors.
The legal action targets website operators accused of violating California and federal laws prohibiting non-consensual pornography and child exploitation. According to the lawsuit, the websites are run by individuals and companies based in California, New Mexico, the UK, and Estonia.
An investigation by Chiu's office found that the 16 websites received a combined 200 million visits in the first half of this year. On social media, he said: "This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation. This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible."
Speaking to the New York Times, Chiu acknowledged that shutting down these sites may be only a temporary fix, as similar websites could emerge in their place. However, he suggested this legal approach could allow for quicker action against future offenders.
Deepfakes pose growing threat to privacy and consent
AI-generated fake pornography traces back to 2017, when an anonymous Reddit user posting under the name "deepfakes" shared fabricated celebrity pornography. The technology has since become far more accessible, often requiring just a single photo to create a convincing fake.
While some so-called deepfakes have benign uses, a 2019 study by Deeptrace found that 96% of 15,000 deepfake videos examined were pornographic. The victims were overwhelmingly women.
A separate study by The Human Factor examined the growing problem of AI-generated nude images of minors created and shared by students; several incidents in U.S. schools have involved students using AI tools to produce and distribute fake nude images of their classmates.