
A new study indicates that AI researchers largely doubt current artificial intelligence approaches will lead to artificial general intelligence (AGI), even as the technology continues advancing.


According to the AAAI study on the future of AI research, more than three-quarters of researchers believe scaling existing AI systems is unlikely to produce AGI. The study found that 76 percent of respondents rated this possibility as "unlikely" or "very unlikely."

The research highlights a strong consensus about the role of symbolic intelligence - over 60 percent of researchers believe any system approaching human-like reasoning would need to be at least 50 percent symbolic. However, the field still lacks formal definitions and agreed-upon testing criteria for AGI.

Despite these doubts, researchers strongly favor continuing development. Seventy percent oppose pausing research until safety mechanisms are in place, though 82 percent believe potential AGI systems developed by private companies should be publicly controlled.


The scientists see significant potential in specialized AI systems. They point to Google DeepMind's AlphaFold as an example of how AI can already accelerate science as an expert system in a narrow domain - the kind of benefit often cited as a key promise of AGI.

Reasoning models show limits despite recent advances

Despite recent advances in test-time algorithms for reasoning models, the study's researchers identify fundamental shortcomings in current AI architectures. The systems struggle with long-term planning, cannot learn continuously, and lack the structured, episodic memory function that humans possess.

The researchers also point to significant gaps in causal reasoning and real-world interaction. While speech and image processing have seen impressive breakthroughs, the systems still lack deeper understanding of physical reality.

The findings come from the "AAAI 2025 Presidential Panel on the Future of AI Research," led by AAAI President Francesca Rossi. The study brought together 24 experienced AI researchers between summer 2024 and spring 2025, examining 17 different AI research topics including scientific discovery and AGI. A complementary survey of 475 participants, primarily from academia (67%) and North America (53%), provided additional perspective on the field's direction.

Summary
  • A recent survey of AI researchers reveals that over three-quarters believe scaling current AI methods alone will not lead to Artificial General Intelligence (AGI).
  • The majority of respondents (60%) predict that achieving human-like thinking capabilities will require at least half of the methods to be symbolic in nature.
  • While AI has made significant strides in specific domains like language and image processing, current systems still struggle with essential skills such as long-term planning, continuous learning, and developing a fundamental understanding of the real world.
Matthias is the co-founder and publisher of THE DECODER, exploring how AI is fundamentally changing the relationship between humans and computers.