JACKSONVILLE, Fla. — Action News Jax investigates the challenges and opportunities artificial intelligence is creating for Florida therapists.

Recent research published in the Journal of the American Medical Association (JAMA) found that 13% of kids ages 12 to 17 and 22% of adults 18 and older said they have used A.I. for mental health advice.


About a year ago, licensed mental health therapist Tammy Horn noticed a pattern, especially with her Gen Z patients, the first generation born into a world with the internet, smartphones, and social media.

Horn said, “I started to realize a lot of my clients were going to ChatGPT to get help or support when they weren’t in therapy with me. Unfortunately, the advice that they were getting wasn’t what they needed.”

Instead of pushing her clients away from A.I., Horn says she embraced it. She created her own platform called Curapractice and an A.I. therapy assistant.

Horn said, “This is our opportunity, for the first time in I don’t know, 150 years to redesign what mental health looks like for our clients.”

Five of her patients agreed to join a pilot program to try it out. Horn says her A.I. chatbot assistant responds in her voice, is trained on treatment models and ethical guidelines, and is customized to the needs of each client.

Horn said her ultimate goal is for other licensed therapists to use her platform to integrate A.I. into their own practices.

Horn said, “So my bot can reach out to the client and say, ‘Hey, how are you doing today? Is everything going okay? You were supposed to work on X, Y, and Z in the last couple of days. How’s that been for you?’”

 Horn said her chatbot is also trained in crisis detection and escalation and will contact her, 9-1-1, and designated emergency contacts if a patient signals they are in crisis or danger.

Horn said the A.I. chatbot is only supplemental, and her clients still meet with her and maintain that human contact.

Human connection is what Diana Huambachano said must be at the center of mental healthcare.

Huambachano said, “No matter how awesome the bot is, no matter how many therapists come together and make a bot, it’s just, there’s something that happens in the therapy room, in the therapy space that you cannot get anywhere else.”


She is the executive director of the Florida Mental Health Counselors Association.

The organization recently held a virtual A.I. summit for mental health professionals. The event emphasized awareness, answered questions, and encouraged caution when it comes to incorporating A.I. into clinical practices.

Huambachano said, “Let’s make sure we’re doing things ethically and compliant for the safety of our clinicians. And also, to protect our own licenses. And the guidelines are not fully out. They’re not there.”

Huambachano said she is not against A.I. and thinks it can provide benefits like reducing paperwork or analyzing data on patient populations and outcomes.

But she said therapists must prioritize patient privacy and be aware of how any information given to A.I. is being used and stored.

Huambachano said, “It’s our responsibility. It’s not our patient’s responsibility; it’s our responsibility to be informed.”

Both Huambachano and Horn agree that artificial intelligence is here to stay, and it's up to therapists to handle their patients and this technology with care.

You can read the American Mental Health Counselors Association 2023 code of ethics addendum on A.I. here.


When working on this story, Action News Jax Anchor Tenikka Hughes asked the A.I. platforms ChatGPT, Claude, and Gemini whether A.I. should be used for mental health support or advice. All three platforms replied that they are supplemental or complementary tools and should not replace professional human care.

The platforms noted some benefits they offer users, like providing accessible, nonjudgmental spaces and helping with self-care reminders or tracking wellness goals.

As for some of the limitations and risks of using A.I. chatbots:

ChatGPT said in part, “AI can sometimes provide inaccurate, inappropriate, or even harmful responses, especially in crisis situations like self-harm or suicidal ideation.”

Claude said, “Over-reliance on AI could prevent someone from seeking necessary professional help.”

Gemini responded, “While AI can mimic empathetic language, it lacks true feeling, emotional depth, and human intuition.”

Florida Rep. Christine Hunschofsky of Parkland recently filed a bill that aims to ban the use of A.I. in psychology, clinical social work, marriage and family therapy, and mental health counseling. It would allow A.I. to be used to transcribe session notes with the patient's written permission.
