
Most people use ChatGPT for practical purposes, according to new research from OpenAI and the MIT Media Lab. The study finds that emotional connections develop mainly among a small group of heavy users of ChatGPT's voice feature.


After analyzing nearly 40 million ChatGPT conversations, researchers found little evidence of users seeking empathy, affection, or emotional support in their interactions. Instead, most people stick to factual exchanges with the AI system.

The team used two different approaches to study how people interact with ChatGPT. OpenAI conducted a large-scale, automated analysis of millions of conversations, keeping user privacy intact by avoiding human review. At the same time, the MIT Media Lab conducted a study with about 1,000 participants, testing how they used both text and voice features.

The MIT study split participants into distinct groups. Some could only use text, while others tested voice interactions with different AI personalities - one designed to be emotionally engaging and another programmed to stay neutral. Each group received specific tasks: personal conversations about memories, practical questions about topics like finances, or free-form discussions.


Heavy advanced voice users are more likely to bond with ChatGPT

While text users generally showed more emotional signals in their conversations, the data revealed a trend among frequent users of ChatGPT's advanced voice mode. This small group developed significantly stronger emotional connections with the AI, more often referring to ChatGPT as a "friend."

The voice feature's effects varied significantly. Brief interactions seemed to make people feel better, but longer daily use often had the opposite effect. According to the study, personal conversations were associated with higher levels of loneliness but lower emotional dependency.

Non-personal conversations showed a different pattern: users developed stronger emotional dependency, especially with intensive use. This suggests that even when interactions are primarily functional rather than emotional, heavy users may still develop a form of dependence on the AI system.

Stacked bar chart: Distribution of ChatGPT conversation topics by interaction type (voice/text) and conversation type (personal/impersonal). Text chat users following predefined prompts about personal topics were most likely to seek emotional support, while other conversations focused on concepts, advice, casual chat, and facts. Source: MIT / OpenAI

According to the researchers, people who tend to form strong emotional attachments and those who viewed ChatGPT as a real friend were more likely to experience negative effects. Heavy users also showed an increased risk, although the researchers couldn't prove a direct cause and effect.

The study comes with some important limitations. According to the researchers, it doesn't capture all the complexities of how humans interact with AI, and it's limited to US ChatGPT users. Still, the researchers believe their methods could help guide future studies in this field.


This work adds to existing evidence that people can form emotional bonds with AI even when they know it's not human. That's partly why AI companies try to prevent their chatbots from acting like conscious beings - they want to avoid responsibility for romantic relationships between humans and machines. For instance, character.ai currently faces legal challenges over claims that its AI personalities have harmed children.

Summary
  • A joint study by OpenAI and MIT Media Lab found that the majority of users communicated factually with the chatbot and hardly formed any emotional bonds.
  • However, a small group of frequent users of the advanced voice feature showed clearer emotional interactions and were more likely to refer to ChatGPT as a friend.
  • According to the researchers, people who tend to form strong emotional bonds and those who viewed ChatGPT as a true friend were more likely to experience negative effects from overuse of chatbots.
Matthias is the co-founder and publisher of THE DECODER, exploring how AI is fundamentally changing the relationship between humans and computers.