
An unknown individual recently impersonated U.S. Secretary of State Marco Rubio, contacting at least five high-ranking officials and dignitaries.

According to an internal State Department memo obtained by the Washington Post, the attacker used AI-generated voice and text messages to closely mimic Rubio's speech and writing style, apparently in an effort to gain access to sensitive information or digital credentials.

The attacker contacted targets via the encrypted messaging app Signal and by text message, using the display name "Marco.Rubio@state.gov" - an address with no actual connection to Rubio. Three foreign ministers, a U.S. governor, and a member of Congress were contacted through these channels. In at least two cases, the attacker left voicemails; in another, the attacker sent a Signal chat invitation by text.

The memo also notes that other State Department employees were approached via email. The attacker's identity is still unknown. The State Department has launched a full investigation and plans to introduce additional security measures. No details have been released about the contents of the messages, the names of those targeted, or whether the attempts were successful.


The FBI had already warned in May of an ongoing campaign using AI-generated voice messages to target senior U.S. officials and their contacts with the goal of stealing information or money. The agency advised against trusting messages that appear to come from high-ranking officials without independent verification. In a similar incident last year, AI-generated robocalls imitating President Biden were used in an attempt to discourage voters from participating in the New Hampshire primary.

Weaknesses in Government Communications

Security researcher Hany Farid of the University of California, Berkeley, told the Washington Post that such impersonation attacks do not require advanced technology. He noted that many government employees are lax about data security, and that the use of Signal for official communication is especially concerning.

Audio samples lasting just 15 to 20 seconds are enough to clone a public figure's voice using online services. Attackers simply upload the sample, confirm they have permission to use the voice, and type out the message they want delivered. Voicemail is especially effective for these attacks because it doesn't require real-time interaction.

Open-source tools like Chatterbox are lowering the barrier for voice cloning even further.

Summary
  • An unidentified attacker impersonated U.S. Secretary of State Marco Rubio using AI-generated voice and text messages, contacting at least five high-ranking officials through encrypted apps and email in an apparent attempt to access sensitive information or credentials.
  • According to a State Department memo, the attacker closely mimicked Rubio's speech and writing style, reaching out to foreign ministers, a governor, and a member of Congress, while the department has now launched an investigation and plans to strengthen security measures.
  • Experts warn that cloning a public figure’s voice requires only brief audio samples and basic online tools, with government employees' lax security practices and the use of Signal for official business raising additional concerns about vulnerability to such attacks.
Max is the managing editor of THE DECODER, bringing his background in philosophy to explore questions of consciousness and whether machines truly think or just pretend to.