
The race to create more powerful artificial intelligence is "a race to the bottom", according to Max Tegmark, a professor of physics at the Massachusetts Institute of Technology and co-founder of the Future of Life Institute.


Tegmark, whose open letter in March calling for a six-month pause in the development of powerful AI systems garnered more than 30,000 signatures, is now expressing concern about tech companies' reluctance to pause.

Citing intense competition, the physicist highlights the dangers of this runaway race. Even though no pause materialized, Tegmark considers the letter a success, insisting that it has made discussion of AI's potential dangers mainstream.

According to Tegmark, it's necessary to stop further development until globally accepted safety standards are established. The professor calls for strict control over open-source AI models, arguing that sharing them is akin to distributing blueprints for biological weapons.


"Making models more powerful than what we have now, that has to be put on pause until they can meet agreed-upon safety standards. Agreeing on what the safety standards are will naturally cause the pause," Tegmark tells The Guardian.

2023's most famous letter yet

In late March, the Future of Life Institute published an open letter signed by prominent business leaders and scientists, including Elon Musk and Steve Wozniak, calling for a six-month pause in the development of AI systems more powerful than GPT-4.

The letter, addressed to "all AI labs," particularly Microsoft and OpenAI, cites a lack of planning and management in the deployment of AI technology and highlights potential risks, including AI propaganda, job automation, and loss of control.

The Institute recommended using the pause to develop safety guidelines audited and overseen by independent experts, and to coordinate with lawmakers for better oversight and control of AI systems.

The letter garnered a lot of attention and certainly sparked discussion, but ultimately didn't lead to any serious plans for a pause. Google plans to release its next-generation Gemini model this fall, while OpenAI is planning visual upgrades for GPT-4 and has just introduced DALL-E 3.


Tegmark and researcher Steve Omohundro recently proposed the use of mathematical proof and formal verification as a means of ensuring the safety and controllability of advanced AI systems, including AGI.

Designing AI so that critical behaviors are provably consistent with formal specifications that encode human values and preferences could provide a safety net. The authors acknowledge the technical obstacles, but are optimistic that recent progress in machine learning will enable corresponding advances in automated theorem proving.
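
To make the idea concrete, here is a minimal sketch in Lean 4. It is not taken from the paper, and the clampAction function and theorem name are purely illustrative: a trivial "policy" clamps a requested action to a safety limit, and a machine-checked proof certifies that the executed action can never exceed that limit.

    -- Toy "policy": clamp a requested action to a safety limit.
    def clampAction (limit a : Nat) : Nat :=
      if a ≤ limit then a else limit

    -- Formal specification, machine-checked: for every limit and every
    -- requested action, the executed action stays within the limit.
    theorem clampAction_le_limit (limit a : Nat) :
        clampAction limit a ≤ limit := by
      unfold clampAction
      split
      · assumption               -- case: a ≤ limit already holds
      · exact Nat.le_refl limit  -- case: the action is replaced by the limit

Scaling guarantees of this kind from a two-line function to the critical behaviors of a large AI system is exactly the open problem the authors hope AI-assisted theorem proving will eventually make tractable.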

Summary
  • Max Tegmark, professor of physics at MIT and co-founder of the Future of Life Institute, continues to push for a pause in the development of powerful AI systems, arguing that the relentless competition in the field could lead to unforeseen dangers. The proposed pause aims to establish globally agreed safety standards for further AI development.
  • Earlier this year, Tegmark and other prominent figures, including Tesla's Elon Musk and Apple co-founder Steve Wozniak, collected more than 30,000 signatures on an open letter calling for a six-month pause in AI development.
  • The letter highlighted potential risks such as AI propaganda, job automation, and loss of control, and suggested using the pause to work with lawmakers and experts to develop oversight and safety guidelines.