Two Miami teenagers have been charged with creating AI-generated nude images of classmates and sharing them without their consent.
Two teenagers, aged 13 and 14, from Miami, Florida, were arrested on December 22, 2023, for allegedly creating AI-generated nude images of male and female classmates aged 12 and 13 and sharing them without their consent, WIRED reports, citing a police report.
According to the police report, the teens allegedly used an "AI app" to create the fake explicit images; the report does not name the app.
The teenagers face third-degree felony charges, the same level as car theft or false imprisonment. The charges stem from a Florida law passed in 2022 to curb harassment through deepfake images.
So far, neither the parents of the accused boys nor the investigator and prosecutor in charge have commented on the case.
Similar cases have come to light in the US and Europe, but the Florida case is believed to be the first known arrest and prosecution over allegedly AI-generated nude images.
AI image generators fuel a rise in child sexual abuse material
The nude bodies in these AI-generated fakes are not images of real bodies but fabrications produced by artificial intelligence. Nevertheless, the images can look authentic, causing psychological distress and reputational damage to those depicted.
The White House recently responded to AI-generated nude images of Taylor Swift, calling this and similar incidents "alarming" and saying new laws are needed.
In late October 2023, the Internet Watch Foundation (IWF) reported that AI image generators are also leading to an increase in child sexual abuse material (CSAM).
The US-based National Center for Missing & Exploited Children reported a "sharp increase" in AI-generated abuse images by the end of June 2023. According to the organization, such images complicate investigations and can hinder the identification of victims.