
A neuroscience writer caught Google's AI search presenting a completely fabricated medical condition as scientific fact. The incident raises questions about AI systems spreading misinformation while sounding authoritative.

The user "Neuroskeptic" discovered Google's AI describing "Kyloren syndrome" as a real medical condition. The twist? Neuroskeptic had invented this fake syndrome seven years ago as a joke to expose flaws in scientific publishing.

The AI didn't just mention the condition - it provided detailed medical information, describing how this non-existent syndrome passes from mothers to children through mitochondrial DNA mutations. All of this information was completely made up.

A fictional disease invented as a joke is presented as a real medical condition by Google's AI Overview. | Image: Neuroskeptic via Bluesky

"I'd honestly have thought twice about doing the hoax if I'd known I might be contaminating AI databases, but this was 2017. I thought it would just be a fun way to highlight a problem," Neuroskeptic said.

AI overview cites sources it hasn't read

While "Kyloren Syndrome" is an unusual search term, this case reveals a concerning pattern with AI search tools: they often present incorrect information with complete confidence. A regular Google search immediately shows the paper was satirical, yet the AI missed this obvious red flag. It shows that context often matters.

Title page of the parody paper, in template format with placeholder authors and a "Mountainview University" affiliation. | Image: Screenshot THE DECODER

But Google's AI model Gemini, which creates these search overviews, completely missed this crucial context - despite citing the very paper that would have exposed the joke. The AI referenced the source without actually understanding what it contained.

Google search result showing a PDF preview of the NMJS paper, which mentions Kyloren syndrome alongside real medical conditions. | Image: Screenshot THE DECODER
The AI-generated answer discussing mitochondrial DNA, with the NMJS paper visible as the source on the right. | Image: Screenshot THE DECODER

In fairness, not all AI search tools fell for the fake condition. Perplexity avoided citing the bogus paper entirely, though it did veer off into a discussion about Star Wars character Kylo Ren's potential psychological issues.

ChatGPT's search proved more discerning, noting that "Kyloren syndrome" appears "in a satirical context within a parody article titled 'Mitochondria: Structure, Function and Clinical Relevance.'"

AI search companies stay quiet about error rates

The incident adds to concerns about AI search services making things up while sounding authoritative. When asked about concrete error rates in their AI search results, Google, Perplexity, OpenAI, and Microsoft have all stayed silent. They haven't even confirmed whether they systematically track these errors, even though doing so would help users understand the technology's limitations.

This lack of transparency creates a real problem. Users won't spend time fact-checking every AI response - that would defeat the purpose. If people have to double-check everything, they might as well use regular search, which is often more reliable and faster. But this reality doesn't square with some people's claims that AI-powered search is the future of the Web.

The incident also raises questions about who's responsible when AI systems spread misinformation that could harm people - as happened recently when Microsoft Copilot falsely portrayed court reporter Martin Bernklau as a perpetrator of the crimes he had covered. So far, the companies running these AI systems haven't addressed these concerns.

Summary
  • A person who invented a fictional disease called "Kyloren syndrome" years ago as a satirical hoax discovered that Google's AI search was presenting it as a real medical condition.
  • While the AI answer ignored evidence of the syndrome's satirical origin, a regular Google search returned a PDF clearly showing the source was a fake scientific paper - apparently the very document the AI answer drew on.
  • Google, Perplexity, OpenAI, and Microsoft have not yet provided any information on specific error rates in their AI-generated search results.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.