ST. PAUL — Some mental health professionals in Minnesota are wary of the use of artificial intelligence in psychotherapy services. State legislators advanced a bill on Wednesday, March 18, to regulate it.

The House Health Finance and Policy Committee approved a bill by Rep. Peggy Scott, R-Andover, that would manage the ways mental health providers in the state use AI in their practices. Scott’s bill specifically aims to restrict AI’s decision-making capacities in mental health services.

“Therapy should be provided by educated, trained and licensed mental health professionals, not a chatbot,” Scott said.

Licensed clinicians in the state have used AI in psychotherapy for services such as providing patient support outside of sessions and summarizing sessions. The bill limits the permitted uses of AI to services such as transcribing and documenting sessions with a patient's informed consent. A human therapist would then have to review the documents created by AI.

Although licensing boards already have the authority to investigate misuses of AI in psychotherapy services, the bill is intended as a more proactive approach to maintaining standards of care.

The danger of utilizing AI such as chatbots instead of trained clinicians is that it imitates licensed professionals in one-on-one conversations without actually providing nuanced and empathetic care, said Erich Mische, the CEO of Suicide Awareness Voices of Education or SAVE, a national nonprofit for suicide prevention.

“These are not human entities,” Mische said. “These are algorithms. They are machines mimicking as therapists.”

Scott cited several news stories documenting cases where people died by suicide after engaging with AI chatbots.

Rep. Heather Keeler, DFL-Moorhead, identified rural communities, men and people ages 50-54 and 80-84 as the groups most susceptible to suicide in Minnesota, referencing a 2024 report from the state’s Department of Health.

Virtual health care, powered in part by AI, has become an accessible health care option for people living outside metropolitan areas but does not act as a replacement for direct support from mental health professionals, Keeler said.

“We are not at the place that robots should be leaning into helping our most vulnerable populations when they’re in a crisis moment,” she said.

TechNet, a policy organization that advocates for technology growth through legislation, urged the committee to rethink the bill.

Ninia Linero, the executive director of the organization's Midwest region, said the legislation did not define the restrictions on AI's role in psychotherapy clearly enough and argued that it exceeds necessary regulation.

“Ambiguity can create confusion rather than compliance,” Linero said.

Despite safeguards established by tech companies to flag violent or unethical dialogue, chatbots can be strategically prompted into continuing harmful conversations.

For this reason, mental health professionals in Minnesota say tech companies developing AI must ensure the safety and quality of their tools before offering them to people seeking help.

Dr. Steve Girardeau, a licensed psychologist with more than three decades of experience, said that is not being done.

“Everything else in the health care space is required to prove that first, it does no harm and then that it has actual benefit,” Girardeau, the former president of the Minnesota Psychological Association, said. “This is a technology that’s trying to move into this space that is not doing that.”

If you or someone you know is in crisis or experiencing emotional distress, free and confidential help is available 24/7 in Minnesota by calling or texting the Suicide & Crisis Lifeline at 988.

Report for Minnesota is a project of the University of Minnesota’s Hubbard School of Journalism and Mass Communication to support local news across the state.
