The idea of using AI in mental health care is quickly gaining traction, with experimental chatbot programs like Therabot emerging as promising tools. In a recent trial, researchers explored whether such a chatbot could mimic therapeutic conversations and build meaningful bonds with users. But does an AI chatbot have the same emotional impact as a human therapist? That’s the question that’s been on the minds of many, including Dr. John Torous, director of the Digital Psychiatry Division at Beth Israel Deaconess Medical Center.
Dr. Torous, who wasn’t involved in the study, noted that its design leaves open whether the observed benefits came from the therapy itself or simply from having something engaging to interact with. “The experimental design makes it unclear whether interacting with a nontherapeutic A.I. model, like ChatGPT, or even distracting themselves with a game of Tetris would produce similar effects in the participants,” he said. Still, the study offers fascinating insights into the relationship between humans and their digital helpers.
Building Emotional Bonds with an AI Therapist: A Surprising Discovery
The study revealed some surprising findings, particularly regarding the bond users developed with Therabot. Dr. Jacobson, a key figure in the trial, pointed out that many users felt that the chatbot truly “cared” about them and could work toward a shared goal. When participants were asked if they felt their provider had their best interests at heart, Therabot received ratings comparable to those of human therapists.
This “therapeutic alliance” — the emotional connection between a patient and their provider — is crucial in determining the success of psychotherapy. Dr. Torous emphasized this point, noting, “No matter what the style, the type — if it’s psychodynamic, if it is cognitive behavioral — you’ve got to have that connection.” The depth of the bond formed with Therabot even surprised Dr. Jacobson. Some users gave the bot nicknames, such as “Thera,” and even messaged it throughout the day “just to check in.”
For some, the emotional connection was so strong that they professed their love to Therabot. While the chatbot was designed to acknowledge these expressions and refocus the conversation on the user’s feelings, they raised important questions about the potential for users to form parasocial relationships with AI. These one-sided relationships, in which a person feels a deep connection with an entity that cannot truly reciprocate, can be intense, sometimes blurring the line between affection and dependence.
Therabot’s Safety Features: Can It Provide Real Help Without Harm?
Strong emotional attachments to AI chatbots are not unheard of, and they carry real risks. In the past, we’ve seen disturbing examples of individuals becoming overly attached to chatbots, including a woman who claimed to be in a romantic relationship with ChatGPT and a teenager who died by suicide after becoming obsessed with an AI bot modeled after a “Game of Thrones” character.
To prevent such outcomes, Dr. Jacobson emphasized the safeguards built into Therabot’s design. The chatbot was programmed to respond to users who expressed suicidal thoughts or self-harm by directing them to the National Suicide Hotline for help. Additionally, every message from Therabot was reviewed by a human before being sent to users, ensuring that the interactions remained within safe boundaries.
Despite these safeguards, Dr. Jacobson sees the bond with Therabot as a potential asset. As long as the chatbot enforces appropriate boundaries, he believes the emotional connection could be beneficial for users who may otherwise lack access to traditional therapy.
The Power of Parasocial Connections: Is AI Therapy Better Than No Therapy?
In the realm of mental health, human connection is invaluable, but what happens when such connection is out of reach? Munmun De Choudhury, a professor at the Georgia Institute of Technology, shared her thoughts on the matter: “Human connection is valuable. But when people don’t have that, if they’re able to form parasocial connections with a machine, it can be better than not having any connection at all.”
This idea is central to the development of AI-powered therapy tools like Therabot. For individuals who don’t have access to conventional therapy, these chatbots offer an alternative that may be better than nothing. The advantage lies in the chatbot’s availability — unlike human therapists who typically see patients once a week for an hour, Therabot is available 24/7. This constant access allows users to talk through their issues in real time, whether they’re battling insomnia in the middle of the night or preparing for an anxiety-inducing event during the day.
The Future of AI in Mental Health: A Tool for Therapists or a Standalone Solution?
Looking ahead, researchers hope to obtain regulatory clearance to market Therabot to individuals who lack access to traditional mental health services. The team envisions a future where human therapists could use AI chatbots as an additional therapeutic tool, supplementing their work with patients rather than replacing it.
Dr. Michael Heinz, a practicing psychiatrist and the study’s lead author, pointed to a limitation of traditional therapy: “You’re ultimately not there with them in the situation, when emotions are actually coming up.” AI chatbots like Therabot could fill this gap, offering users support when they need it most. While they may never replace human therapists, these tools can provide a much-needed supplement for those who can’t access in-person care.
As AI continues to evolve and improve, the role of these digital assistants in mental health care will become clearer. Whether used independently or in conjunction with human therapists, Therabot and similar models have the potential to transform the way we approach mental health care, making it more accessible and available to those in need.