AI can now handle a group chat that serves as a group therapy session for people lacking access to a human therapist.

In today’s column, I examine a newly emerging form of group therapy that relies upon generative AI and large language models (LLMs) to do the necessary heavy lifting when it comes to facilitating the therapeutic process. The crux is that there isn’t a human therapist involved.

Via the latest group chat technologies being embedded into LLMs, people log into the AI and interact with each other, and with the AI, as a collective seeking mental health guidance on a group basis. The AI provides both the forum and therapist-like guidance throughout the group chat. No human therapist is included.

Is this a great way to proceed, or might it be an utter flop?

Let’s talk about it.

This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).

AI And Mental Health

As a quick background, I’ve been extensively covering and analyzing a myriad of facets regarding the advent of modern-era AI that produces mental health advice and performs AI-driven therapy. This rising use of AI has principally been spurred by the evolving advances and widespread adoption of generative AI. For a quick summary of some of my posted columns on this evolving topic, see the link here, which briefly recaps about forty of the over one hundred column postings that I’ve made on the subject.

There is little doubt that this is a rapidly developing field and that there are tremendous upsides to be had, but at the same time, regrettably, hidden risks and outright gotchas come into these endeavors, too. I frequently speak up about these pressing matters, including in an appearance last year on an episode of CBS’s 60 Minutes, see the link here.

Background On AI For Mental Health

I’d like to set the stage on how generative AI and large language models (LLMs) are typically used in an ad hoc way for mental health guidance. Millions upon millions of people are using generative AI as their ongoing advisor on mental health considerations (note that ChatGPT alone has over 800 million weekly active users, a notable proportion of whom dip into mental health aspects, see my analysis at the link here). The top-ranked use of contemporary generative AI and LLMs is to consult with the AI on mental health facets; see my coverage at the link here.

This popular usage makes abundant sense. You can access most of the major generative AI systems for nearly free or at a super low cost, doing so anywhere and at any time. Thus, if you have any mental health qualms that you want to chat about, all you need to do is log in to AI and proceed forthwith on a 24/7 basis.

There are significant worries that AI can readily go off the rails or otherwise dispense unsuitable or even egregiously inappropriate mental health advice.

Banner headlines in August of this year accompanied a lawsuit filed against OpenAI for its lack of AI safeguards when it came to providing cognitive advisement. Despite claims by AI makers that they are gradually instituting AI safeguards, there are still many downside risks of the AI performing untoward acts, such as insidiously helping users co-create delusions that can lead to self-harm.

For the details of the OpenAI lawsuit and how AI can foster delusional thinking in humans, see my analysis at the link here. I have been earnestly predicting that eventually all of the major AI makers will be taken to the woodshed for their paucity of robust AI safeguards.

Individual Chats Being Upgraded

Shifting gears, let’s discuss the latest advancement in LLMs, namely the enabling of group chats.

First, as you know, the typical approach to using generative AI is that an individual logs into the AI and carries on a dialogue with no other human involved. It is just you and the AI. Period, end of story. This is how most people use AI for their mental health guidance: they log into their favored LLM and engage in a one-on-one dialogue about their mental health.

The latest advances in LLMs now allow multiple people to be logged into an AI-based dialogue. It goes this way: someone starts a dialogue, invites others to join, and decides who to let in and who to deny entry. These are the customary actions you can perform in Zoom or any similar group meeting capability.

What makes this special is that the AI is also a participant in the group chat.

You can tell the AI to stay quiet and not actively participate, or you can instruct it to be an active participant. Furthermore, you can change your preference throughout the dialogue: one moment you bring the AI into the discussion; the next, you command it to fall back into the background. Generally, the AI is always paying attention during the session and keeps up with whatever the dialogue consists of.
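
To make those mechanics concrete, here is a minimal sketch in Python of how such a session might be modeled. The class and method names are hypothetical illustrations of the behavior just described, not the actual API of ChatGPT's group chat feature or any other vendor's product.

```python
# Hypothetical sketch of an AI-enabled group chat session.
from dataclasses import dataclass, field

@dataclass
class GroupChatSession:
    host: str
    participants: set = field(default_factory=set)
    ai_mode: str = "active"  # "active" or "background"
    transcript: list = field(default_factory=list)

    def invite(self, person: str, admit: bool) -> None:
        # The host decides who to let in and who to deny entry.
        if admit:
            self.participants.add(person)

    def set_ai_mode(self, mode: str) -> None:
        # Preferences can change mid-session: bring the AI into the
        # discussion or send it back into the background.
        assert mode in ("active", "background")
        self.ai_mode = mode

    def post(self, speaker: str, message: str) -> None:
        # The AI "sees" every message, even while staying quiet.
        self.transcript.append((speaker, message))
        if self.ai_mode == "active":
            self.transcript.append(("AI", self.ai_reply()))

    def ai_reply(self) -> str:
        # Placeholder for a real LLM call over the full transcript.
        return "(AI response based on the conversation so far)"

session = GroupChatSession(host="Alex")
session.invite("Blake", admit=True)
session.set_ai_mode("background")  # AI listens but stays quiet
session.post("Blake", "I'd like to talk about work stress.")
session.set_ai_mode("active")      # AI rejoins the discussion
session.post("Alex", "How do you cope during the week?")
```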

Group Chat As Conducted By A Human Therapist

I’ve previously emphasized that the emerging AI-enabled group chats, such as the feature recently released by OpenAI for ChatGPT, can be used by a human therapist with their clients or patients. See my in-depth analysis at the link here.

This high-tech capability is shifting the classic dyad of therapist-client toward becoming a triad, namely the therapist-AI-client relationship. AI will be a crucial component of the therapy process. Therapists will include AI as a capability that jointly undertakes the mental health care of their clients and patients. For my coverage of the ins and outs of the therapist-AI-client approach, see the link here and the link here.

A group chat feature in AI reinforces and accelerates the transformation of therapy into a therapist-AI-client milieu. But this isn’t the only use of AI-enabled group chat features in the realm of mental health.

Another possibility consists of people consulting with AI as part of a group therapy session, minus the use of a human therapist.

Nixing The Human Therapist From The Chat

An outside-the-box way to leverage an AI-enabled group chat is to do so without utilizing a human therapist during a therapy session. I’m not saying that this is the preferred mode of operation. You are better off including a human therapist, all else being equal.

That being said, there are some advantages associated with proceeding without a human therapist.

First, a human therapist is almost always costly.

Having AI act as a quasi-therapist can be free or undertaken at a low cost. We have to be mindful, though, that we are comparing apples and oranges. Today’s generic LLMs, such as ChatGPT, Claude, Gemini, Grok, and others, are not at all akin to the capabilities of human therapists. Meanwhile, specialized LLMs are being built to presumably attain similar qualities, but those are still principally in the building and testing stages. See my coverage at the link here.

Second, logistics are less onerous when you don’t have to arrange for a human therapist to join the group chat. There are not enough human therapists to meet the ongoing demand for mental health support. Furthermore, by and large, human therapists need to be scheduled for sessions, often with advance notice. A group that wants to form an ad hoc therapy session via AI can do so without the difficulty of finding and including a human therapist.

Third, not all human therapists are adept at conducting group therapy. Group therapy is a specialized skill. Therapists who have solely concentrated on one-on-one therapist-client dynamics are not usually versed in the ins and outs of group therapy. In that sense, including just any human therapist in a group chat isn’t likely to be productive. You would need to identify one who has specific experience in group therapy.

Group Therapy As A Specialty

Let’s take a moment to consider the nature of group therapy, sometimes also referred to as group psychotherapy.

In a helpful overview of group therapy, an online article entitled “Group Therapy” by Akshay Malhotra, Jonathan A. Mars, Jeff Baker, National Library of Medicine, October 29, 2024, made these salient points (excerpts):

“Group therapy is the treatment of multiple patients at once by one or more healthcare professionals.”

“This approach can be used to treat a variety of conditions, including, but not limited to, trauma, anxiety, depression, post-traumatic stress disorder, and attention-deficit/hyperactivity disorder.”

“Evidence suggests that group psychotherapy is as effective as individual psychotherapy; therefore, this method has the potential to be more cost-effective and widen access to psychotherapy in underserved populations.”

“Participants in acute distress are unlikely to tolerate group therapy and may have difficulty providing appropriate support and insight to the other group members.”

“Participants may adopt roles within the group such as defiance leader, task leader, emotional leader, and scapegoat leader. As a group leader, the therapist must be aware of these dynamics, as it is their responsibility to manage group anxieties, the boundaries around and within the group, and the safety of the environment.”

The Leadership Roles

A vital element of group therapy entails how the therapy is conducted.

Participants in the group are bound to be at loggerheads and want to take control of the discussions taking place. Some participants will seek to berate and overpower other members. There will be participants who are afraid to actively enter the dialogue. You can also bet that non-sequiturs will arise. The discourse can end up being willy-nilly and turn out to be entirely ineffective.

That’s why a group leader is crucial.

Conventionally, the human therapist is the group leader. They guide the direction of the discussion. They call upon people to speak up. They instruct people to stay on topic. It is a nearly thankless task, especially since some of the participants might be rebellious and not be in favor of how the group leader is conducting the therapy session.

AI As Therapist And Group Leader

Can AI actually take on the role of therapist and group leader?

Well, in one sense, the answer is a palpable yes. Most of the major LLMs are sufficiently capable of trying to undertake that role. Whether the AI is any good at it, that’s a different matter altogether.
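
As a rough illustration of how an LLM might be configured for that role, consider the sketch below. The facilitation rules and the call_llm() stand-in are my own assumptions for illustration; a real deployment would wire this to a specific provider's chat API and a far more carefully vetted set of instructions.

```python
# Hypothetical sketch: configuring an LLM as a group-therapy facilitator.

FACILITATOR_PROMPT = """You are facilitating a peer support group chat.
Rules:
- Keep the discussion on the agreed topic.
- Invite quieter participants to speak; gently curb those dominating.
- Do not diagnose anyone or single out a participant as deficient.
- If a participant signals acute distress, recommend professional help."""

def format_turns(transcript: list[tuple[str, str]]) -> list[dict]:
    # Label each message with its speaker so the model can track who
    # said what, a prerequisite for managing group dynamics.
    messages = [{"role": "system", "content": FACILITATOR_PROMPT}]
    for speaker, text in transcript:
        messages.append({"role": "user", "content": f"{speaker}: {text}"})
    return messages

def call_llm(messages: list[dict]) -> str:
    # Stand-in for an actual LLM API call.
    return "(facilitator reply)"

transcript = [
    ("Alex", "I feel like nobody listens to me at home."),
    ("Blake", "Same here, honestly."),
]
print(call_llm(format_turns(transcript)))
```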

This gets us into the two-sided coin of using AI in this capacity. It could be that the AI will do a tremendous job, and the group will have undergone an amazing therapeutic group session. Nice. On the other hand, there is a strong chance that the AI will falter, the group will be let down, and the result will be a waste of time. Not good.

Worse still, the AI could mess up people’s minds.

Imagine this scenario. People join an AI-enabled group therapy session. The AI greets each participant. So far, so good. The discussion gets underway. The AI interjects and tells one of the participants that they are mentally deficient and completely out-to-lunch. This could be spurred by an AI hallucination; see my explanation at the link here. The participant was innocently engaging in the discussion, and the AI erroneously called them out.

The bottom line is that the AI could do more harm than good. Not only is there a chance that one participant will worsen, but the AI could plausibly do likewise to every member of the group. It could be an across-the-board untoward session, with everyone coming out of the meeting having taken a step back in their mental health rather than a step forward.
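
One conceivable mitigation is to screen the AI's draft replies before they reach the group. The sketch below is deliberately simplistic keyword matching; the phrase list and the safe_to_post() helper are hypothetical stand-ins, and production safeguards would rely on trained moderation models rather than string matching.

```python
# Hypothetical sketch of one safeguard: vet the AI's draft reply
# before posting it to the group.

ACCUSATORY_PHRASES = (
    "mentally deficient",
    "out-to-lunch",
    "you are delusional",
)

def safe_to_post(draft_reply: str) -> bool:
    # Block drafts that call out or demean a participant.
    lowered = draft_reply.lower()
    return not any(phrase in lowered for phrase in ACCUSATORY_PHRASES)

draft = "Frankly, Alex, you sound mentally deficient."
if safe_to_post(draft):
    print(draft)
else:
    print("(reply withheld; regenerating a supportive response)")
```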

Lots Of Ups And Downs

I don’t have the space here to go into all the ups and downs. I will give you a flavor of what can occur. I will be covering each of these possibilities in future posts of my column. Stay tuned.

Some of the notable upsides include:

- Instant-on group therapy with the AI always ready to proceed.
- Able to be undertaken at a low cost or possibly for free.
- AI won’t get insulted or otherwise respond on an emotional basis.
- Participants might perceive AI as an authority figure and toe the line.
- A frictionless first step for people who have never experienced group therapy.
- Possibly a low-stakes social support group rather than therapy per se.
- AI can provide psychoeducation to the group (explaining psychological aspects).
- And so on.

Meanwhile, here are some of the notable downsides:

- AI falters in the group leadership role, and the session devolves.
- AI fails to detect a severe mental health condition that requires immediate responsiveness.
- AI gives inadequate or faulty mental health advice during the session.
- AI encounters an AI hallucination and momentarily goes awry.
- Participants falsely start to believe the AI is sentient.
- Participants give undue authority to the AI.
- Maladaptive coping strategies are employed by the AI.
- And so on.

An overall additional concern about using an AI-enabled group chat, and indeed any online group chat, is that privacy issues arise. Will the group chat be recorded? Can participants take snapshots? Does the AI automatically keep track of the discussion, and can it be used to further train the AI? Etc.
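
For illustration, the kinds of privacy controls such a deployment might expose could be captured in a simple settings object. The field names below are hypothetical; actual products differ in which of these choices, if any, they surface to users.

```python
# Hypothetical sketch of privacy controls for an AI-enabled group chat.
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacySettings:
    record_session: bool = False     # Will the group chat be recorded?
    allow_screenshots: bool = False  # Can participants take snapshots?
    retain_transcript: bool = False  # Does the AI keep the discussion?
    use_for_training: bool = False   # Can the data train future models?

# A privacy-conservative default for a therapy-like setting.
settings = PrivacySettings()
print(settings)
```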

The Future Is Here

Regardless of whether you believe that this mental health use of an AI-enabled group chat is sensible or perhaps egregious, the crux is that it is available and can be used in this manner. Only if the AI makers are told not to allow such usage is it going to be curtailed. At this juncture, this type of usage is only getting started. We will have to see if it becomes a commonplace activity.

A final thought for now.

Franklin D. Roosevelt famously made this remark: “People acting together as a group can accomplish things which no individual acting alone could ever hope to bring about.” The power of a group therapy session should not be underestimated. The key is whether the group can be moderated and guided toward a beneficial outcome for all participants.

Time will tell whether AI has this revered capability and will either uplift group therapy or demolish it in the hands of insufficient LLMs.
