Most say no, but there are reasons to wonder whether this will change, argues sociology professor Joseph E. Davis. In this piece, he analyzes the potential and limitations of chatbots and the extent to which they can take over the tasks of psychiatrists.

“As an AI, I can help assist in various ways, but I must clarify that I cannot fully replace a human psychiatrist.” This disclaimer, from the popular AI-powered chatbot ChatGPT, appears in a recent article in the trade journal Psychiatric Times. The author, Dr. Pratt, had asked ChatGPT how it might replace a psychiatrist. It answered that it cannot, at least not yet and not fully.

The very question about replacing a psychiatrist with software might seem a little strange. As I am discovering, however, talking to a machine about mental health problems is very common and quickly becoming even more so.

In 2017, according to an article in JAMA, millions of patients in the U.S. and globally were already discussing their mental health challenges with software programs such as “Gabby.”


Mental Health Chatbots

Since then, a number of popular mental health chatbots have been launched, including Woebot, Wysa, and Youper. Wysa claims to have “held over half a billion AI chat conversations with more than five million people about their mental health across 95 countries.” Youper claims to have “supported the mental health of over two million people.”

In a 2021 national survey commissioned by Woebot Health, 22 percent of adults reported having used a mental health chatbot. Sixty percent said they began this use during the pandemic and 44 percent said they used chatbots exclusively, without also seeing a mental health professional. Another 47 percent said they would be interested in using a therapeutic chatbot if they felt they needed the help.

ChatGPT was not designed as a mental health tool, but it is being adapted to function as one. Posts on social media sites, such as Reddit, now coach people on how to prompt the chatbot with hypothetical scenarios.

To “train [it] to become a therapist,” according to a site called the ChatGPT Blog, you first give it instructions on the role it is to adopt: “You are Dr. Tessa, a compassionate and friendly therapist … Show genuine interest … Ask thoughtful questions to stimulate self-reflection.” Then, the user starts sharing her concerns.
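For the curious, here is a minimal sketch of how such a role instruction might be sent to ChatGPT programmatically using OpenAI's Python client. The "Dr. Tessa" wording follows the blog post quoted above, while the model name and the sample user message are my own illustrative assumptions, not part of the original guide:

    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

    # The system message adapts the "Dr. Tessa" role instruction; the user message is invented for illustration.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You are Dr. Tessa, a compassionate and friendly therapist. "
                    "Show genuine interest. Ask thoughtful questions to stimulate self-reflection."
                ),
            },
            {"role": "user", "content": "Lately I've been feeling overwhelmed at work."},
        ],
    )

    # The chatbot replies "in character" as Dr. Tessa
    print(response.choices[0].message.content)

From that point on, the conversation proceeds like any other chat, with the model staying in the assigned role.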

What a Chatbot Can and Can't Do

Discussions in both the popular media and academic literature identify several possible ways that chatbots might be of support in mental health care. But the consensus is that while these tools could serve in an adjunct role, they are not a replacement for professionals. When asked by Dr. Pratt, ChatGPT concurred (it was trained on these same sorts of sources).


What ChatGPT reported it could do was assist with aspects of treatment that involve administrative tasks or data collection. It could do things like provide information about disorders and therapies, administer screening questionnaires, monitor compliance, and analyze patterns in treatment effectiveness or symptoms. Things, in other words, that have structure and regularities that can be identified and predicted. Call this the machine element.

What it could not—or could not yet—replicate was the “human element” of care. Contrasting psychiatrists’ work with its own limitations, ChatGPT reported:

  • Psychiatrists’ work involves “human connection” and “understanding nuances of individual experiences and cultural contexts.”
  • Psychiatrists “combine the medical, psychological, and social aspects of mental health to diagnose and treat their patients.”
  • “Psychiatrists can empathize with patients in a deeply personal way, understanding complex emotions and experiences.”
  • “Psychiatrists take a holistic view of a person’s health. They can look at physical health, lifestyle, personal circumstances, and wider societal issues, which can all impact mental health.”

No doubt the psychiatrists reading the article found it reassuring. AI might not be coming for their jobs after all.

Who or What Will Be the Adjunct?

Still, I can’t help but wonder. Speaking of AI in an adjunct role reminded me of the way that drugs like Ritalin, Thorazine, and the old minor tranquilizers were advertised in the 1950s and 1960s.


They were promoted not as a replacement for psychoanalysis and psychotherapy but as an adjunct with benefits like those that ChatGPT claims for itself: increasing the efficiency of care, encouraging open communication, and helping to overcome patient resistance to help-seeking.

We know what happened next. Now, if anything, therapy is an adjunct to drugs.

We also have some idea of what happened to psychiatry in recent decades. Talk therapy has largely disappeared, appointments have shortened in length, and much of the treatment has been reduced to prescribing medications.

As two psychiatrists writing in the New England Journal of Medicine observe, “Checklist-style amalgamations of symptoms have taken the place of thoughtful diagnosis, and trial-and-error ‘medication management’ has taken over the practice to an alarming degree … the field seems to have largely abandoned its social, interpersonal, and psychodynamic foundations, with little to show for these sacrifices.”

From what I can gather from my interviews with patients and from reading about what takes place in training and practice, much of what ChatGPT says psychiatrists bring to their patients—a holistic view of health, deep personal empathy, and a nuanced understanding of experience, social context, and the complexity of emotion—sounds pretty anachronistic. Actual practice seems more machine-like, centered on matching a diagnosis to a drug, than like an old-style therapeutic alliance.

The Future of Mental Health Care

If we add institutional pressures such as cost, high throughput, and the striving, to quote the two psychiatrists again, “of corporate and administrative interests … to benefit from a falsely simplified and deterministic formulation of mental illness and its treatment,” then the priority of the human element seems even more doubtful.

Psychiatrists may end up being the adjuncts, though not because ChatGPT can offer genuine relational care. It can do nothing of the sort. Imagine talking to someone who begins by saying, “Welcome, I’ll be seeing you today,” and then proceeds, quoting ChatGPT, to acknowledge that it does not “have feelings or personal experiences,” but will be trying its internet-trained best to “mimic empathy and compassion,” which it “currently cannot fully replicate.”

That should give you pause. It's artifice all the way down.

The reason psychiatrists may get displaced is not some special power of AI but the fact that much of what is distinctively human in their practice has already been largely sidelined. If they are reduced to competing with machines to do machine-like things, the psychiatrists will be carrying water for the machines, not the other way around.

Perhaps the chatbots can be a wake-up call, alerting us, by their very artifice and superficiality, to what human care truly is. And to what it should always aim to be.

Summary
  • Talking to a chatbot about mental health issues is becoming more common.
  • Commentators agree that chatbots can be a support, but not a replacement for mental health professionals.
  • There are concerns that the human element in mental health care is already disappearing.
Joseph E. Davis Ph.D.

Joseph E. Davis is Research Professor of Sociology and Chair of the Picturing the Human working group of the Institute for Advanced Studies in Culture at the University of Virginia. His research explores the intersecting questions of self, morality, and cultural change.
