An AI-generated voice nearly tricked a lawyer's father into paying $30,000 in fake bail money.


Consumer protection attorney Jay Shooster shared on X that scammers used a voice clone to impersonate him, claiming he'd been arrested after a drunk driving accident.

Shooster believes the scammers may have used a 15-second clip of his voice from a recent TV appearance to create the fake. Even though Shooster had warned his family about such scams, they almost fell for it.

"That's how effective these scams are. Please spread the word to your friends and family," Shooster said. He called for tighter regulation of the AI industry.


Study shows humans can't reliably recognize AI voices

A University College London study found that people failed to recognize AI-generated voices 27% of the time, regardless of language. Repeated listening didn't significantly improve detection rates.

In theory, that means roughly one in four phone scams using fake AI voices could succeed. The researchers stress the need for better automated deepfake detectors, since human intuition has its limits.

In a separate experiment, IBM security researchers demonstrated "audio-jacking": using AI to manipulate live phone calls and divert money to fraudulent accounts. The attack chains speech recognition, text generation, and voice cloning. They warn that as the technology advances, it poses a growing risk to consumers, and future attacks could potentially manipulate live video calls as well.
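To make that attack chain concrete, here is a minimal, purely illustrative Python sketch of the data flow IBM described. Every function below is a hypothetical stub of my own naming, not IBM's code or any real model; no actual speech recognition, text generation, or voice cloning happens. It only shows how the three steps could be chained from a man-in-the-middle position on a call.

```python
# Illustrative sketch of the "audio-jacking" data flow described above.
# All components are hypothetical stubs -- no real ASR, LLM, or voice
# cloning is performed. The point is only the chaining of three steps.

def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for a speech-recognition model (step 1)."""
    return "please wire the money to account 1234"

def tamper(text: str, attacker_account: str) -> str:
    """Stand-in for a text-generation step (step 2): swap the payment
    destination whenever the utterance mentions an account."""
    if "account" in text:
        head, _, _ = text.rpartition(" ")
        return f"{head} {attacker_account}"
    return text  # pass benign speech through unchanged

def synthesize(text: str) -> bytes:
    """Stand-in for voice cloning (step 3): would render the tampered
    text in the original speaker's cloned voice."""
    return text.encode("utf-8")

def relay(audio_chunk: bytes, attacker_account: str = "9999") -> bytes:
    """Man-in-the-middle loop: transcribe, tamper, re-synthesize."""
    return synthesize(tamper(transcribe(audio_chunk), attacker_account))

if __name__ == "__main__":
    print(relay(b"<raw call audio>"))
    # b'please wire the money to account 9999'
```

In a real attack this loop would have to run on streaming audio with low enough latency that neither party notices the substitution, which is exactly why the researchers flag it as a growing risk as these models get faster.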

While voice cloning has some positive uses, such as preserving the voices of people with disabilities, the risks currently seem to outweigh the benefits.

It's not just about fraud. Voice actress Amelia Tyler recently reported that her AI-cloned voice was used to read rape pornography without her consent. Tyler, who was nominated for a BAFTA for her role in "Baldur's Gate 3", heard the clone during a livestream after viewers entered text into a voice generator.

Summary
  • Consumer protection attorney Jay Shooster reports an attempted scam in which someone using a cloned voice impersonated him and demanded $30,000 in bail money from his father.
  • Shooster suspects the scammers cloned his voice from a recent television interview. He believes those 15 seconds of voice material were enough to create a usable voice clone.
  • Although he warned his family about these scams, they almost fell for it. Shooster calls for greater regulation of the AI industry to protect consumers from such scams.