
Using AI to detect and report child abuse material on social media platforms can delay investigations


Social media companies like Meta are using AI to detect and report suspicious material, but US law enforcement and the National Center for Missing and Exploited Children (NCMEC) can only open these reports with a warrant, reports The Guardian.

The only exception is content that has already been reviewed by a human at the social media company.

Obtaining a search warrant can take days or weeks, resulting in delays in the investigation and possible loss of evidence.


In addition, automatically generated reports often lack the specific information needed to obtain a search warrant.

According to NCMEC, Meta sends by far the largest number of these AI-generated reports. In 2022, more than 27 million, or 84 percent, of all reports were generated through the Facebook, Instagram, and WhatsApp platforms. In total, the organization received 32 million regular and AI-generated reports from various sources.

AI delays child abuse investigations

The reliance on a search warrant leads to a backlog of AI-generated reports that are not followed up, putting pressure on already overburdened law enforcement teams. As a result, many AI-generated leads go unaddressed, according to The Guardian.

The warrant requirement for AI-generated leads is tied to privacy protections in the United States. The Fourth Amendment prohibits unreasonable government searches.

In addition to AI-generated reports, AI-generated images of child abuse can further slow down investigations.


In October 2023, the Internet Watch Foundation (IWF) warned about the growing amount of AI-generated child pornography content. The proliferation of synthetic but photo-realistic images of child abuse poses a "significant threat" to the IWF's mission of removing such content from the Internet.

The child safety nonprofit Thorn also fears that the volume of synthetic images will hinder the search for victims and the fight against actual child abuse. The flood of images is overwhelming the existing investigative system, which must determine which images are real and which are fake. Both organizations report a sharp increase in such images.

Summary
  • AI systems that detect child abuse material on social media platforms can delay investigations because U.S. law enforcement and NCMEC require a search warrant to open the automatically generated reports.
  • Obtaining a warrant can take days or weeks, resulting in delays in the investigation and possible loss of evidence. In addition, the reports often lack the specific information needed to obtain a warrant.
  • The growing amount of AI-generated child pornography content poses a significant threat to organizations such as the Internet Watch Foundation and Thorn, which seek to remove such content from the web and locate victims.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.