After Grok's deepfake flood, Senate passes bill letting victims take creators to court
Key Points
- The US Senate has unanimously passed the Defiance Act, which lets victims of non-consensual AI-generated nude images sue creators and seek damages.
- The bill builds on the "Take It Down Act," which requires platforms to remove such content within 48 hours. The House of Representatives still needs to approve it.
- The legislation comes in response to a flood of sexualized AI images on Elon Musk's platform X, where Grok generated roughly 6,700 such images per hour, according to researcher Genevieve Oh.
The US Senate is responding to the flood of sexualized AI images on Elon Musk's platform X with new legislation that would let victims sue creators.
The US Senate has unanimously passed a bill giving victims of non-consensual, sexually explicit AI images the right to sue the people who created them. According to Bloomberg, the Defiance Act would let victims seek damages and obtain restraining orders.
The legislation builds on the "Take It Down Act" that President Donald Trump signed in May, which requires social media platforms to remove such content within 48 hours of a victim's request. The Defiance Act adds civil remedies on top of these criminal provisions.
The Senate previously passed the Defiance Act unanimously back in 2024, but it stalled in the House of Representatives. For the bill to become law, the House will need to pass it this time around.
Grok generates thousands of sexualized images every hour
Elon Musk's platform X has emerged as one of the main sources of non-consensual AI-generated nude images. An analysis by deepfake researcher Genevieve Oh found that Grok was generating roughly 6,700 sexually suggestive or undressing images per hour over a 24-hour period.
X has since restricted Grok's image generation feature for most users, limiting it to paying subscribers only. The office of British Prime Minister Keir Starmer sharply criticized this move, saying it simply turns an AI feature that creates illegal images into a "premium service." Starmer announced plans to enforce a British law against the non-consensual sexualization of images.