Amid rising AI integration in mental health care, experts in the Commonwealth seek regulations to ensure safety, patient privacy, and the essential human element.

HARRISBURG, Pa. — Artificial intelligence is expanding into nearly every part of daily life, including mental health care. As adoption rises, so do concerns about safety, privacy, and effectiveness.

Mental health experts are backing H.B. 2100, a bill aimed at determining how AI can be safely integrated into care across the commonwealth.

“It’s the wild, wild west in terms of regulation in the area of AI and counseling,” said Rachel Baturin, executive director of the Pennsylvania Psychological Association.

A hearing held this week in the Pennsylvania House brought together mental health officials and licensed counselors to discuss how the bill could regulate the safe use of AI.

“These bots are built to be very affirming, and I recognize what they’re doing,” Dr. Curtis Taylor, a licensed professional counselor from Erie, Pa., said in the hearing. “That’s part of rapport-building that is part of counseling, but we’re not hyper-affirming to the point that we’re throwing away, you know, common sense and safety.”

The goal is to integrate AI programs that assist with administrative tasks, counseling strategies, and organizational work while preserving patient confidentiality and ensuring human oversight during use.

“It’s when AI is being used as a standalone tool or instead of the professional where I think things go awry,” said Dr. Molly Cowan, a licensed psychologist and director of professional affairs at PPA.

Dr. Kathleen Dougherty, vice chair with Penn State Health’s Department of Psychiatry and Behavioral Health, explained that online chat services do not replicate the emotional support and human connection found in one-on-one counseling.

“I really don’t think that it could ever fully replace therapy because it’s very nuanced: what somebody is saying, how they’re saying it, why they’re saying it, when they’re saying it,” Dr. Dougherty said.

Officials also voiced concerns about where people’s data goes after it is entered into these systems, whether it can be mined for advertising or profit, and what oversight exists for the technology.

Dr. Dougherty, who uses AI to assist with daily tasks, is operating a pilot program with Penn State Health focused on note-taking and other administrative services.

She told FOX43, however, that the concern runs deeper than asking a few questions in a chat box.

“They have spent hours a day just talking to their AI companion, getting affirmation back — and that really removes people from the real world,” Dr. Dougherty said.
