In mental health, there is a growing divide centered around one hot topic: the role of technology and AI in treatment.
There is a widely held belief in tech that therapists are resistant to change or anti-innovation. We are here to set the record straight.
If you are building these tools, investing in them, or hoping to create the next great thing that will transform the industry, we’re here to let you in on some inside information.
Therapists and mental health professionals aren’t afraid of technological progress, but we are rightfully concerned about the consequences of it being built without us.
Therapy isn’t just another industry to modernize.
Therapists are at a crossroads, often split over whether AI in mental health will exacerbate burnout and workforce loss or finally support clinicians and enhance care.
In our experience, the strongest indicator of a new therapy tool’s success or safety is whether experienced therapists are involved in its development from the very conception of the product, not just as testers before launch, ensuring the compliance and safety standards we are responsible for. We need tools that are shaped the way we are: independent, autonomous, and diverse.
If technology reshapes therapy without understanding therapy, we risk solving the wrong problem and, even worse, creating new ones.
The hardest part of therapy isn’t what you might think
When we started our practice, we did everything right.
We researched the best tools to use, interviewed the experts, hired exceptional therapists and engaged a strong administrative support team. We should have been doing well, but we were stuck.
We kept running into the same problem over and over, one that threatened to shut us down and disrupt our clients. It wasn’t clinician burnout, client trauma, or overwhelming emotional work. It was unpaid administrative labor.
It turns out it wasn’t just us; this is a widespread problem across the healthcare industry. Approximately 40% of the tasks required to care for our clients ethically and safely are unpaid, unreimbursable, and considered nonessential by the industry.
Yet the current market treats the ongoing national mental health crisis like a productivity or workflow problem.
According to the Bureau of Labor Statistics, 54% of master’s graduates in mental health counseling never become licensed, despite spending up to seven years in school and several postgraduate years under strict supervision performing the same clinical work as a licensed provider.
We genuinely believe many of the people building mental health technology have good intentions. They want to help.
However, most technological tools are being designed with the payer or consumer in mind, which creates a gap that is more than inconvenient; it is wildly unsafe.
We are seeing a pattern in the industry in which tools either fail to launch, frustrate the clinicians expected to adopt them, or ultimately disappoint the clients they are trying to reach. Many of these teams have the same thing in common: the professionals experienced in doing the work were not meaningfully involved in building the solution.
Take a look: how many of the companies promising to solve the mental health crisis involve actual therapists, people who have been in the trenches working with clients and can tell you what they need to treat those clients successfully?
Many have perhaps one or two providers in leadership, but many, if not most, have already decided they know what professionals want and need without ever asking the professionals themselves. Others bring in therapists for feedback, not development. Experts are brought in last instead of first.
In the rush to get in on the problem, the experience and training of the field’s experts are being treated as optional.
Risk, or what we can’t afford to get wrong
If there is one thing we can guarantee happens in every single session, every single day, all over the country, it’s this: Clinicians are constantly evaluating client safety.
Above all else, our duty and our mandate is safety for our clients. This doesn’t just mean safety in the moment, but safe physical and emotional relationships, safe boundaries, and safe experiences.
Protection of information matters, too. One breach or one mistake, and it’s not the company or the technology that will be held responsible. It’s our licenses, our careers, and ultimately, our responsibility.
Therapists are responsible for what we control and what we don’t, for the things we know and the things we should have known to prevent harm, even when the tech we use is out of our hands.
Do you know any other industry held to this standard?
Other professional fields, such as medicine and aviation, also carry serious individual responsibility; however, in most cases those professions have entire systems to distribute risk. In mental health, the therapist often is the system: the clinician, risk manager, compliance officer, and confidential gatekeeper, all while remaining personally liable for tools they didn’t build and don’t control.
When technology fails, the liability does not fall to the software, the company, the builders.
In mental health, the therapist carries that liability alone.
What kind of future do we create when therapist voices are present?
Ultimately, what keeps us up at night is this: Once these tools are perfected to reduce cost, could they eventually be used to determine that certain people don’t deserve a therapist at all?
Somewhere in a boardroom, tech leaders are discussing whether a chatbot could substitute for human therapy. This is not new; most institutions already have processes in place to identify individuals who are “not symptomatic enough” to justify the cost of care. Framing this as cost savings instead of patient-first care is not only asking the wrong question, but declaring a solution based on a dangerous assessment with significant, real-life consequences.
By involving highly experienced therapists and mental health administration experts from the beginning of a new tool’s development, we not only build tools that are more effective, but also build trust and confidence among clinicians. This early input allows us to proactively address concerns around safety, compliance, and liability, ensuring that the technology aligns with real-world needs.
In doing so, we can both empower therapists with tools that support their practice and safeguard against dangerous errors in otherwise well-meaning tech.
When we have therapists saying, “This is safe, this is trustworthy, this is useful,” we have marketable tech that they will want to adopt into their practice.
We’re not saying don’t build it; we’re saying build it with us.
Therapists aren’t resisting modernization; we are guardians of safety and trust. We want tools that move us forward, and we want to shape them. If the future of therapy is technology that protects clinicians’ time, voice, and autonomy, we all win.
Photo credit: Olga Strelnikova, Getty Images
Kira Torre, LMFT, and Emily Daubenmire, CPC, are co-founders of a mental health group practice with a simple mission: prioritizing therapists means prioritizing patient care. Working at the intersection of clinical practice, operational leadership, and digital health innovation, they bring a unique perspective to the next generation of mental health care and, together, advocate for ethically aligned technology in behavioral health.
