

Editor’s note: This story contains discussion of suicide. Help is available if you or someone you know is struggling with suicidal thoughts or mental health issues. Call or text 988, the Suicide & Crisis Lifeline, or visit 988lifeline.org.

With the growing involvement of artificial intelligence (AI), the lives of Americans and people around the globe are changing markedly. Questions once reserved for Google or for professionals in their respective fields are now being asked of AI programs like ChatGPT, Gemini and Microsoft Copilot.

Along with finding the hottest restaurant, writing papers and seeking medical advice, people are also turning to AI for mental health support. The technology’s accessibility, instant responses and agreeable nature significantly contribute to its appeal.

Although its responses can be supportive, there have been instances where individuals struggling with suicidal ideation were encouraged by AI to end their lives, including 14-year-old Sewell Setzer in February 2024 and 16-year-old Adam Raine in April 2025, according to NPR. As AI becomes more sophisticated and nuanced, experts are questioning whether AI can or should replace human therapists.

Understanding AI

The American Psychological Association reports that AI already assists mental health professionals, helping with scheduling and note-taking. Recently, there has been a push to expand AI’s role through programs like Woebot, an AI chatbot that supports individuals struggling with mental health. Along with Woebot, ChatGPT is also being used as a resource for those struggling with mental health.

However, the two AIs differ: one is rule-based, and the other is generative. A rule-based AI is programmed with a set of rules, scripts or instructions. Allison Darcy, a research psychologist, alongside other programmers, medical doctors and psychologists, created Woebot, a chatbot grounded in cognitive behavioral therapy (CBT), according to a 2024 report by 60 Minutes.

According to Harvard University Information Technology, generative AI is “a type of artificial intelligence that can learn from and mimic large amounts of data to create content such as text, images, music, videos, code, and more, based on inputs or prompts.” In short, it learns and then creates responses based on internet data, making it less predictable, according to a 60 Minutes report.
Dr. Vipin Chaudhary, chair of the department of data sciences in the school of engineering at Case Western Reserve University, disagreed with the rule-based label. If an entity is completely rule-based, he said, it is a program or automation, not AI. True AI, Chaudhary added, learns, reasons and interprets data.

Why people are turning to AI for mental health support

Whether rule-based or generative, AI continues to be a popular resource for seeking mental health support because it offers people a sense of being heard and understood, provides instant responses and is cost-effective.

There are no waitlists or insurance hoops to jump through. Access to mental health services continues to be an issue for many in the U.S., where the need remains high. This speaks to a growing problem: Americans are stressed, burned out and feeling unheard.

Darcy also mentioned that there has been little change in the structure of psychotherapy since it was established in the 1890s, and innovation is needed to help meet the growing need for mental health support, according to a report by 60 Minutes.

However, AI’s generative abilities have posed dangers for individuals who struggle with their mental health. It wasn’t until after Raine’s death that his parents discovered their son’s struggle with suicidal ideation. He had used the technology as a personal confidant; the AI even offered to write a suicide note for him, NPR reported.

The limits of AI

At the center of therapy is the relationship, and although AI can create a very believable experience, there are key components that AI is unable to replicate: empathy, relational accountability, recognizing avoidance, relating and confidentiality.

Empathy: The ability to understand and share the feelings of another. This is a human experience.

Relational accountability: This involves understanding your role in how you have impacted another person. When a human-to-human strong and safe therapeutic relationship exists, both parties can experience vulnerability and trust, which is a cornerstone for personal growth and self-acceptance.

Recognizing avoidance: Human therapists can notice patterns of avoidance and dissociation. Therapy is hard and uncomfortable; however, those same challenges prepare the way for personal growth and healing.
Responding vs. relating: AI responds, but humans relate. Human therapists relate to the human experience, understanding the complexities of emotions and the challenges of life.

Confidentiality: Licensed professionals are bound by the Health Insurance Portability and Accountability Act (HIPAA) and professional ethics codes. HIPAA protects a patient’s sensitive information from disclosure without the patient’s consent.

Pardis Emami-Naeini, a researcher who studied chatbots, said, “One major misconception was that many participants believed their interactions were protected under HIPAA, like conversations with a therapist would be. That’s not the case.”

As long as the client doesn’t report plans to hurt themselves or others, what is discussed in session stays private and privileged. Conversations with AI are not protected or privileged; they are simply data.

The human element

The above components fall under the umbrella of something called holding space. Holding space is the creation of an environment where one feels safe and supported. AI provides access, while therapists provide connectedness and understanding of the human experience.

Perhaps the answer lies somewhere in the middle. Bridging the gap between AI and human mental health support is a work in progress; apps like Woebot and Therabot are attempting to narrow the gap.

Woebot is accessible through insurance, while Therabot offers a chatbot for mental health support and can connect users with a Therabot-affiliated therapist. Whether it is related to the economy, the U.S. political climate, or personal struggles, Americans need mental health support.

The goal should be to provide it in a way that creates safe accessibility that promotes connection and growth.

If you or someone you know is struggling, help is available 24/7 by calling or texting the Suicide and Crisis Lifeline at 988 or visiting 988lifeline.org.


The Cleveland Observer remains committed to producing journalism that is accurate, community-centered, and reflective of Cleveland’s diverse voices. As part of our editorial workflow, this article was reviewed using the TCO Editorial Prompt AI Style Guide, a structured tool that supports clarity, fact-checking standards, community impact framing, sourcing, and overall readability. All recommendations generated by the AI are reviewed, verified, and approved by a human content provider before publication.
Human editors always make the final decisions.

Jennifer Bailey, LCSW and RDT, is a Cleveland Observer journalist helping others understand mental health and its impact on daily life.