AI systems like ChatGPT are taking on roles once reserved for doctors—not because they are more medically capable, but because they are not constrained by time, bureaucracy, or exhaustion.

When journalist Kate Pickert was diagnosed with breast cancer ten years ago, she had numerous questions about test results, side effects, and new clinical studies. Her oncologist often responded promptly via email, but sometimes Pickert received only an out-of-office reply: on call, traveling, unavailable for days. At a time when every hour felt critical, such delays seemed interminable.

Looking back, Pickert, now a journalism professor, wonders what might have been different if tools like ChatGPT had existed at the time: a system available 24/7 that could explain medical terminology, summarize research findings, and do so in a compassionate tone. That question forms the basis of an essay she recently published on Bloomberg.

Her central argument is that artificial intelligence in healthcare has so far been seen primarily as a tool for improving efficiency: supporting diagnosis, automating documentation, and analyzing data. But its most significant benefit may lie elsewhere, in human connection.

One example is Rachel Stoll, a patient living with Cushing's syndrome, a rare hormonal disorder. After a frustrating medical encounter, she turned to ChatGPT for the first time and was surprised by what she found. The system responded not only with accurate medical information but also with empathetic phrases like “That must be frustrating” and “I’m sorry to hear that.” No time limit, no digressions, no impatience.

Offering emotionally intelligent digital conversations

The idea that language models can sometimes appear more empathetic than human doctors has been supported by recent research. A study from New York University found that patients rated chatbot responses as more empathetic than those written by physicians. The reasons appear to be structural: time constraints, administrative burdens, and high workloads often prevent empathy from surfacing in daily clinical routines.

These differences aren’t just theoretical. Dr. Jonathan Chen of Stanford University tested ChatGPT with an ethical dilemma from his own practice: a patient with dementia could no longer swallow. Should a feeding tube be inserted or not? The system’s response was so nuanced and compassionate that Chen’s reaction was, “Holy crap, this chatbot is providing better counseling than I did in real life.” For him, it became a learning opportunity: a low-risk environment in which to practice difficult conversations.

Such applications may become increasingly valuable in medical education. Instead of hiring actors to simulate patient interactions, students could use AI-powered simulations to practice engaging with various personality types—from anxious parents to withdrawn cancer patients. Bernard Chang, a medical educator at Harvard, sees this as a way to strengthen the emotional skills of future physicians.

Supporting more human care through machines

The paradox at the center of Pickert’s essay is that more automation could help make healthcare more human. By offloading routine tasks—such as taking notes during consultations or drafting medical reports—AI could give clinicians more time for what patients need most: attention and care.

AI is already used in diagnostics, for example in interpreting X-rays. But its potential as a communication tool remains underappreciated, even though this is precisely where improvement is most needed. When a machine gently asks the question a human forgot to ask, the problem lies not with the AI but with the system that made it necessary.

Summary
  • AI systems like ChatGPT are being used to support patients in ways doctors often can't—providing round-the-clock, emotionally aware responses without the limits of time, exhaustion, or bureaucracy.
  • Studies and real-world examples show that patients sometimes perceive chatbot replies as more empathetic than those from physicians, due in part to structural issues like time pressure and administrative workload in healthcare.
  • Experts suggest AI could strengthen medical education and improve patient care by handling routine tasks and offering emotionally intelligent simulations, ultimately allowing clinicians to focus more on human connection.