Kaiser has been battling these industry dynamics for more than a decade. California regulators have cited the company multiple times and fined it twice for making patients wait too long for mental health appointments, ordering Kaiser to address understaffing.

Administrators are actively exploring how AI tools could help expand access to therapists, for example by reducing the time clinicians spend on paperwork so they can spend more of it with patients.

Kaiser declined several requests for an interview, but said in a statement that AI tools don’t make medical decisions or replace human care. Rather, they hold “significant potential to benefit health care by supporting better diagnostics, enhancing patient-clinician relationships, optimizing clinicians’ time, and ensuring fairness in care experiences and health outcomes by addressing individual needs.”

Kaiser contracts with mental health workers typically span two to four years. The company did not respond to specific questions about how AI could lead to job losses during that timeframe.

Managers told the union during negotiations that they do not “intend” to lay off therapists because of the technology. But when the union pressed Kaiser to put that in writing in the contract, several representatives, including Marcucci-Morris, said the company told them, “We can’t predict the future. We need to maintain flexibility,” and “We want to leave our options open.”

How Kaiser uses AI now in mental health care

Kaiser is already deploying AI note-taking technology in mental health care. Piloted first in medical exam rooms, these digital scribes record interactions between doctors and patients, then generate summaries for the patient’s medical record. Many mental health clinicians are optimistic about this innovation, as they typically spend two and a half hours a day, often in the evenings, writing clinical notes.

“It’s called pajama time,” said Jodi Halpern, a psychiatrist and professor of bioethics at UC Berkeley. Her research shows that paperwork is the biggest cause of burnout among clinicians. “So the idea that we could replace that so that human care could grow, I love that idea.”

Kaiser mental health care workers and supporters march from Oakland Kaiser Medical Center to Kaiser’s corporate headquarters on Friday, Aug. 19, 2022, the fifth day of an open-ended strike. (Beth LaBerge/KQED)

The technology is controversial among Kaiser clinicians, though. Some appreciate digital scribe software as a time saver that also allows them to be more present with their clients, making eye contact rather than typing. But many are wary of potential privacy breaches, the ethical implications of using therapy transcripts to train AI models, and whether patients might censor themselves when they’re being recorded. Marcucci-Morris has declined to use it for these reasons, anticipating that only one in 10 of her patients would consent if she asked.

“It’s not the same as talking to your physician about a rash or your vitamin D deficiency,” she said. “I wouldn’t want a recording of my disagreements with a family member or details of the terrible things that have happened to me.”

In light of the unknowns, therapists have asked Kaiser management for a contract clause stipulating that the use of digital scribes will remain optional, or at least “not mandatory,” but Kaiser declined the proposal.

The union is also concerned about Kaiser’s recent introduction of electronic mental health triaging, an optional tool that routes patients into care based on their answers to an online questionnaire about anxiety and depression.

Brittany Beard, a licensed clinical therapist at Kaiser Permanente, poses for a portrait at her home in Vallejo on Nov. 24, 2025. (Gina Castro/KQED)

Some patients won’t like this, but some will prefer it, said Merage Ghane, a clinical psychologist and director of responsible AI at the Coalition for Health AI. “There are people who really don’t like talking to a real person,” she said.

Vallejo-based therapist Brittany Beard used to do this triage work herself, talking to clients for 15 to 20 minutes on the phone. But after Kaiser outsourced many of those calls to an outside company and developed the e-visit questionnaire, she was reassigned to a new department. Though still employed at Kaiser, she already feels replaced by an app.

“They sell it as accessing care faster, but I’ve seen the opposite,” Beard said. Now, when some of her patients meet her for their first appointment, “They’re frustrated. It was like they were battling just to get to me.”

Is AI coming for your therapist?

How much AI infiltrates mental health care will be determined, in part, by the consumer. Experts have identified a “trust gap” between health administrators’ eagerness to roll out AI tools and patient concerns; to bridge the divide, they recommend transparency and involving patients in implementation. Qualitative studies show that patients are optimistic about the technology’s potential to improve diagnosis and treatment, but they remain skeptical of “robots” or “machines” taking over from humans.

“The prevailing sentiment really was that AI is at its best when it’s a tool that doctors can use to do their jobs better. Once that moved into the realm of replacing human interaction and experience, that was not a good thing,” said Michele Cordoba, a researcher at Culture IQ, which produced a report for the California Health Care Foundation.

At the same time, the use of commercial AI chatbots for mental health has soared. One study surveyed AI users with mental health conditions and found that nearly half turn to their chatbot for psychological support; of those, 63% said the advice was helpful.

But mental health professionals have questioned the efficacy of such advice, and several families have sued AI companies, alleging their chatbots encouraged suicidal and self-harming behavior.

In the meantime, clinical psychologists are developing evidence-based chatbots, like TheraBot, to deliver tested therapeutic guidance. The Food and Drug Administration acknowledged the broad demand for such apps at a November meeting and is exploring what kind of authority it might have to regulate them, including requiring human mental health professionals to oversee them.
