Former NSA contractor and whistleblower Edward Snowden has warned against trusting OpenAI and its products, including ChatGPT, after former NSA director Paul Nakasone was appointed to the company's board of directors. Snowden calls the move a "willful, calculated betrayal of the rights of every person on Earth."
"They've gone full mask-off: do not ever trust OpenAI or its products (ChatGPT etc). There is only one reason for appointing an NSAGov Director to your board. This is a willful, calculated betrayal of the rights of every person on Earth. You have been warned," Snowden writes.
A day later, Snowden added that the combination of AI with the vast amount of mass surveillance data accumulated over the past two decades will "put truly terrible powers in the hands of an unaccountable few."
Nakasone, who served as the NSA director from 2018 to February 2024, has joined OpenAI's newly formed Safety and Security Committee. His role is to help strengthen the company's efforts in utilizing AI for cybersecurity purposes. OpenAI also collaborates with the U.S. military on cybersecurity projects.
The announcement followed internal criticism of OpenAI's safety culture, with several safety researchers recently leaving the company. However, these researchers were more concerned with the alignment of a potential "super AI" than with day-to-day cybersecurity risks.
Snowden isn't only targeting OpenAI. Speaking at the SuperAI conference in Singapore, he also voiced his frustration with the broader "AI safety panic."
While he acknowledges that warnings of a Terminator-style superintelligence threatening humanity are well-intentioned, he sees a greater danger in people embedding their political views into AI models and presenting minority opinions as general consensus.
Snowden specifically singled out Google, which recently faced criticism for history-distorting diversity rules in its AI image generator: overly strict guidelines led the tool to produce images of dark-skinned people in Nazi-era uniforms.
"It's like, don't interfere with the user, you don't know better than the user, don't try to impose your will upon the user, and yet we see that happening," Snowden said.