People talk about AI as if it can fix almost anything. Mental health has become its newest target, with promises that technology might finally bridge the huge gap between people who need support and those who can actually get it.

With more than half of U.S. adults facing mental health challenges and many unable to access professional care, the idea of therapy available to anyone, anytime, is understandably appealing. But as a therapist, I’m not convinced we’re ready to treat AI as a stand-in for human care.

Brianna Hardcastle is the clinical director at Tampa's River Oaks Treatment Center. (courtesy, Brianna Hardcastle)

While the lines between humans and machines are becoming increasingly blurred, the fact remains: Chatbots are not human, and chatbot therapy cannot effectively replicate the human experience.

An AI system is built to transact rather than explore. Without the organic give-and-take of a real therapeutic exchange, patients lose the sense that someone is not merely interpreting their words but genuinely trying to understand them.

Curiosity is an essential part of a therapist’s role. Even the most sophisticated AI can’t reproduce the human instinct to linger on a detail, to ask a question that wasn’t in the script, to follow a patient’s story into the unexpected. A lot of therapy involves digging into what’s under the surface.

Plus, a chatbot can’t catch subtle red flags (a downward look, a body posture, a quiet tone) that may indicate a person is in crisis.

When I interact with a chatbot, I can nudge it toward the answer I want, reshaping my prompt until the reply lines up with the conclusion I’ve imagined. Unfortunately, this can play out in dangerous ways.

It brings to mind an adolescent client I once worked with whose thinking had become so narrowed that she couldn’t see beyond her own fearful assumptions. I worry what might happen if someone with similar patterns relied on a chatbot for guidance. When I hear tragic stories like the teens in Massachusetts and here in Florida who have died by suicide after interacting with chatbots, I get even more concerned.

In a good therapeutic environment, a therapist is going to challenge your beliefs and assumptions to help you grow. If that doesn’t happen, you won’t be pushed outside your comfort zone, and you may end up deepening the very patterns you’re trying to break.

A chatbot responds instantly, and it’s always available. But at the end of the day, it’s still a machine — not a friend and certainly not a therapist. Depending on a bot for emotional support can actually deepen loneliness over time. I’ve seen this with phones. Time spent hunched over a screen, trading real, in-person interactions for virtual ones, has negatively impacted many of my clients.

AI chatbots have the potential to take the emotional and mental harms of tech use to a whole new level. They can so easily mimic a real conversation in ways that feel surprisingly personal. For someone already feeling lonely, depressed or emotionally overwhelmed, turning to AI for connection can backfire. It can leave them even more cut off from the real relationships and interactions that are vital for connection and emotional well-being.

I do believe there’s a place for AI in mental health care, as long as it functions as support, not as the primary source of care. AI can certainly help reduce administrative burdens, streamline scheduling, or handle routine tasks, but the core therapeutic work should stay in the hands of a human therapist.

I also see value in using AI as a kind of preliminary step toward further help. For example, someone who’s struggling might run a question or concern through a chatbot, not for a final answer, but as a starting point they can bring into a therapy session for deeper discussion and reflection.

There’s no doubt that AI can offer convenience, information and even moments of comfort. But therapy is more than well-phrased responses. It involves insight, empathy and a relationship built over time. Before we assume technology can fill that role, we should remember what’s uniquely human about healing and do whatever we can to protect and preserve that.

Brianna Hardcastle is the clinical director at Tampa's River Oaks Treatment Center, an American Addiction Centers facility. River Oaks offers a dedicated residential Primary Mental Health program alongside addiction treatment services.
