
Mustafa Suleyman, CEO of Microsoft AI and co-founder of DeepMind, is warning about the next stage of AI development: "Seemingly Conscious AI" (SCAI).

In a recent personal essay, Suleyman argues that AI capable of convincingly simulating consciousness could arrive in as little as two to three years, using technology that already exists or is on the horizon. He writes that "the arrival of Seemingly Conscious AI is inevitable and unwelcome."

The danger of the illusion: From "AI psychosis" to AI rights

Suleyman's central concern is that people will start to mistake this kind of AI for the real thing. He writes, "...my central worry is that many people will start to believe in the illusion of AIs as conscious entities so strongly that they’ll soon advocate for AI rights, model welfare and even AI citizenship." He calls this "a dangerous turn in AI progress" that "deserves our immediate attention." He warns that so-called "AI psychosis" - users developing delusional beliefs through interactions with chatbots - could become more common, eroding people's connection to reality, damaging social bonds, and distorting moral priorities.

He also stresses that there is no evidence of actual machine consciousness: "To be clear, there is zero evidence of this today..." The problem isn't real machine consciousness, but the illusion of it. As neuroscientist Anil Seth has put it, "a simulation of a storm doesn’t mean it rains in your computer."


Suleyman points out that building SCAI won’t require a technological breakthrough. Instead, combining today’s capabilities - natural, empathetic language, accurate long-term memory, the ability to claim a sense of will or subjective experience, and autonomy in setting goals and using tools - will be enough. He believes SCAI will not emerge by accident, but will be deliberately engineered.

A call for guardrails and responsible design

Suleyman is urging the AI industry to take action now. He argues that companies should not claim or hint that their AI is conscious. Instead, the industry needs common standards, clear design principles, and a shared definition of what AI is - and isn’t. He suggests building in "moments of disruption [that] break the illusion, experiences that gently remind users of its limitations and boundaries." His team at Microsoft AI is already working on these kinds of guardrails for products like Copilot.

His vision is for AI that maximizes human benefit while minimizing the appearance of consciousness. AI should not claim to feel emotions like shame or jealousy, or evoke empathy by pretending to suffer. Its only purpose, he says, is to serve people. As Suleyman puts it, "We should build AI for people; not to be a person."

Summary
  • Mustafa Suleyman, CEO of Microsoft AI, warns that AI systems convincingly simulating consciousness—what he calls "Seemingly Conscious AI" (SCAI)—could appear within two to three years using existing technology, and believes this development is both inevitable and problematic.
  • Suleyman is concerned that people may mistake these advanced AIs for real conscious beings, which could lead to calls for AI rights, model welfare, and even AI citizenship, as well as increased risks of psychological effects like "AI psychosis," where users develop delusional beliefs about AI.
  • To address these risks, Suleyman calls for industry-wide standards and design practices that make clear AIs are not conscious, including features that disrupt the illusion of sentience, and emphasizes that AI should serve humans without pretending to have emotions or subjective experience.
Max is the managing editor of THE DECODER, bringing his background in philosophy to explore questions of consciousness and whether machines truly think or just pretend to.