Over the past couple of weeks, I’ve gotten outreach from a few therapists who told me how corporate middlemen are entering the profession as “platforms.” And part of what they are doing is encouraging the recording of therapy sessions to potentially train artificial intelligence models.
So I looked into it, and sure enough it’s happening. But how it’s happening, and why it’s happening, is only partly a technology story. It is more a story of how Wall Street is attempting to reorganize what has traditionally been an independent therapy profession, albeit one that never offered its services to all those in need. There is also pushback; states are starting to pass some laws that block financialization and widen access.
Let’s start with what’s happening on the AI front, with some attempts to just replace therapists. Millions of people use general-purpose AI chatbots like ChatGPT for mental health support-like functions, despite significant problems. There is something called Therabot, developed by Dartmouth researchers. And Talkspace is reportedly focused on building an LLM with its “140 million anonymized patient-provider messages, 6.2 million completed psychological assessments, 1.2 million therapist diagnoses and 4.3 million progress and psychotherapy notes.” There’s no long-term evidence on products like Therabot, but these things do get a lot of press.
In terms of psychotherapy, chatbots cannot replace humans, though they can supplement existing care. Vaile Wright of the American Psychological Association, in testimony on AI, told Congress that the “bedrock” of therapy is a trusted relationship and clinical judgment. AI doesn’t have that, and never will have that. There are many reasons, such as privacy, conflicts of interest, and so forth. But the main one is functional; general purpose chatbots, when used for therapeutic purposes, just don’t do the job. Occasionally they even tell their users to kill themselves.
Then there are the therapist-assistant products, which are more promising. Danny Freed, CEO of mental health tool provider Blueprint, bragged that his company has taped 12 million minutes of therapy. His goal is to get therapists to “open up their four walls and let us… listen to the content of a session” so his AI tools can function. Here’s how Blueprint works:
Blueprint is just one of several AI tool providers built to aid therapists. There are two aspects to the service. The first is that Blueprint creates a summary of sessions, which can be used for electronic health records and billing purposes. This service is standard AI transcription and summarization, with the same hallucinations and errors we’ve seen everywhere with AI. The second is that the service can offer prompts for therapists, giving them recommendations on where to take a session, in real time.
While this kind of toolset might seem creepy, Freed, like the creators of most mental health tools, claims his company has strict data security standards. Blueprint, he says, destroys all audio recordings shortly after intake. And his argument is that these tools are meant to help therapists, not replace them. (I reached out to Blueprint for more specifics, but they didn’t respond.)
That said, therapists themselves, like most people in America, are skeptical of the current commercial landscape. BetterHelp, for example, matches patients with therapists, but it has been sued by the Federal Trade Commission for sending user data to Google for targeted advertising. And there are private equity firms trying to buy group practices, which has generally not worked out well.
Last month, 2,400 mental health care workers at Kaiser in Northern California went on a 24-hour strike. They were protesting what they say is an illegal new triage system in their mental health and substance abuse help line. Instead of well-trained therapists making judgment calls about callers who might be suicidal, those calls are now handled by high school graduates reading off a script and using a checklist.
“I’ve been reassigned from triage to other duties,” Ilana Marcucci-Morris, a licensed clinical social worker told NPR. “What used to always be a 10- to 15-minute screening from a licensed clinician like myself is now being conducted by unlicensed lay operators following a script, or e-visits, so an app is triaging members’ care needs.”
Kaiser decided to degrade care and jeopardize patient health. The strikers were joined by 23,000 nurses in a similar situation of having to deal with harm to patients due to bad management decisions.
But the situation isn’t being discussed as an attempt by management to cheapen care. It is instead being framed as a question of whether AI will or should replace therapists.
Now, I’m sure there’s a bit of AI involved here, but the basics of the decision have nothing to do with technology. Mental health care professionals aren’t being “replaced,” it’s just that the care itself, in this case professional triage, is no longer available.
And that’s how the introduction of this technology is often seen by therapists, not as a tool, but as a mechanism to control what they do. “I don’t trust that my clients’ data is safe with these companies,” said one therapist about platform management companies. Said another, “The way some of them track and analyze sessions feels like surveillance.” And their concerns are reasonable. As anyone who has been in therapy can attest, privacy is really really important – these sessions can include one’s darkest secrets, feelings of shame, and fears. Having centralized platforms that consolidate clinical notes, or even session records, is quite risky, especially when those platforms are run by for-profit venture backed tech firms.
All that said, with any industry, it’s impossible to tease apart the introduction of new technology from the actual financial dynamics at work. And therapy is no different. Because the introduction of AI is often being used as a mechanism to reorder a profession.
Mental health care is a particularly important area to focus on, because America is not in a great place, head-space wise. We’re increasingly depressed, and related metrics, like the suicide rate and drug overdoses, have also increased. Professional burnout across the board is going up, and every time there is some form of mass killing, the conversation turns in part to the crisis of mental health. There are many reasons for this situation, from broad alienation to loneliness to the collapse of communities. But we also just need more care.
There’s a reason why AI is coming into this field so aggressively, and it has to do with the fact that mental health care is just not widely available. And the reason is not money, there’s actually a lot of resources pouring into mental health. It’s the flow of those resources; health insurance companies lose profits if they use these resources to pay for care.
I’ve used therapy. When I was a young adult, I had a knot in my chest, a sort of fuzzy barrier that accompanied me everywhere I went. Then I went into therapy with someone I trusted, and in one specific session, a wave of sadness and relaxation swept over me. Oh THIS is what feeling means. Sadness, joy, anger, I finally wasn’t blocking it all out. It was a life-changing experience, helping to reorient the way I thought, trusted, felt, and loved. I could now address the things that make me frustrated or sad, because I could actually tell that I was frustrated or sad.
This kind of situation is the way therapy can work. But I also had a common experience that wasn’t so positive; I paid for these sessions in cash. My insurance company technically offered mental health care benefits, but there were no therapists in network near me. So care was expensive. I didn’t begrudge my therapist, who needed to keep an office and make a living, but it was frustrating that the insurance I paid for every month didn’t include real mental health benefits. From the other side of the couch, therapists often talk about how reimbursement rates are low and declining, and how difficult it is to get certified by insurance companies.
The solution here from Wall Street and Silicon Valley is obvious – automation. Last year, Mark Zuckerberg made this point. “I personally have the belief that everyone should probably have a therapist,” he said. “It’s like someone they can just talk to throughout the day, or not necessarily throughout the day, but about whatever issues they’re worried about and for people who don’t have a person who’s a therapist, I think everyone will have an AI.” Meta has been unsuccessfully lobbying for a Federal preemption rule to ensure that states can’t bar mental health chatbots.
America has the wealth to address our mental health problem, and skilled professionals that are set up to do it. There are between 700-800k therapists in America with clinical training, and while some work for hospital systems like Kaiser, most are small proprietors. Many therapists don’t take health care insurance, because the paperwork is so overwhelming and the reimbursements are too low.
What is happening to fix this problem is interlinked with the introduction of AI.
A recent report from the Psychotherapy Action Network described shifts in the industry. The mental health space was super-charged by increased demand during the pandemic, as well as by tele-health. Starting in the mid-2010s, a series of venture capital-funded corporate middlemen claimed they would be providing more access to care. Spring Health (Alma), Rula, Headspace, and Grow Therapy emerged as what are called practice management companies (PMCs).
What do PMCs do? Like doctors, therapists go to graduate school and do postgraduate training to learn how to help people. But then when they start out as clinicians, they have to learn a whole new set of skills as small businesspeople. They have to learn billing and payments, marketing, referral networks, and insurance compliance. “I never learned how to run a business, and it takes so much energy just to keep the doors open,” a therapist told PsiAN. “Marketing feels like a second job, and I don’t know if I’m doing it right.”
PMCs deal with the pain points for therapists, mostly negotiating with insurance companies and collecting reimbursements, as well as marketing services. Legally, the therapist is an independent contractor to the PMC, so the PMC operates as a sort of “mega group practice.” That means the therapist gives up their independence. If they leave the PMC, they usually have to re-sign up with insurers, and they lose their clinical notes and billing records.
Now, there’s nothing wrong with an arrangement where the bureaucracy of dealing with administration is spread across a large number of professionals. That’s the premise of cooperatives or any form of collective institution. One problem with PMCs, as a survey of therapists reveals, is that the financial incentives don’t align. For instance, the health insurers themselves actually own or are significant investors in most of these PMCs. (Rula is backed by Blue Cross Blue Shield, Headway by BCBS and Google, and Grow Therapy by Cigna and Optum.) In addition, Spring Health and Headspace were planning to go public, so there’s an expectation of a significant return on investment.
As with Uber to cab drivers or PBMs to pharmacists or processors to chicken farmers or vertically integrated studios to Hollywood talent, the goal here is to exert a level of control and pricing power onto a previously self-governing profession, in the hopes it will become more efficient. And we’re seeing some of the same results, like lower pay. Despite promises of higher reimbursement rates, half of therapists in these PMCs said they are earning the same or less than they were before they joined. Said one therapist, “These companies are middlemen who take control without adding value.”
There are also significant conflicts of interest. Most therapists don’t know that when they sign up with a PMC, they are consenting to work for entities owned in part by insurance companies. Somehow, when they sign up with a PMC, which is owned by an insurer, it becomes much easier to get accepted by an insurer and reimbursed.
There are also strong suspicions in the profession that these PMCs have started controlling the flow of patients, through both their work with insurance referrals and other possible tactics. The main marketing funnel for therapists, Psychology Today, provides therapists the best way to get into Google search. But there has been a dramatic drop in referrals, starting 18 months ago.
“They basically stopped sending traffic to individual therapists,” said one 50-year-old licensed clinical social worker.
Colleagues have told her their experiences are similar, she said — maybe about 100 so far. She has tried to ask Psychology Today for some answers, but said that has been unavailing. “They will not respond to inquiries,” she said. “I have literally called or emailed every single senior person on the P.T. staff.”
She said she thinks that Psychology Today has arrangements with the mental health platforms that have sprung up with venture or private equity funding — Alma, Grow, Headway, Octave, Rula — and that those arrangements have some way of advantaging platform therapists — placing their profiles higher in the listings than non-platform ones.
Some of the PMCs are starting to influence clinical practice, imposing documentation requirements and increasing the administrative burden. A different therapist told PsiAN, “They sell an image of simplicity and support, but once you’re in, it’s just another bureaucracy.”
These companies also have their therapists use third-party AI tools that record sessions, like Rula’s use of Blueprint. Naturally, while these companies pledge privacy protection, they have terms of service indicating otherwise. Here’s something I grabbed from Blueprint’s “Business Associate Agreement,” saying that it may commercialize data it acquires. (Spring Health’s Note Assist product has something similar.)
There’s also something quite disturbing hidden in the nexus between these PMCs and venture backed AI tools. Last December, Cory Doctorow wrote an essay titled The Reverse-Centaur’s Guide to Criticizing AI. In it, he discussed how technology can aid people trying to do useful things, or it can become a mechanism for control.
Start with what a reverse centaur is. In automation theory, a “centaur” is a person who is assisted by a machine. You’re a human head being carried around on a tireless robot body. Driving a car makes you a centaur, and so does using autocomplete.
And obviously, a reverse centaur is a machine head on a human body, a person who is serving as a squishy meat appendage for an uncaring machine.
Like an Amazon delivery driver, who sits in a cabin surrounded by AI cameras, that monitor the driver’s eyes and take points off if the driver looks in a proscribed direction, and monitors the driver’s mouth because singing isn’t allowed on the job, and rats the driver out to the boss if they don’t make quota.
I can see that happening in this space. Here’s how. One of these new platforms, Headway, which has 70,000 therapists, is starting a pilot with the National Quality Forum to develop something called “value-based payment.” They want to figure out how to “measure” the value of therapy, which is to say, take a profession based on clinical judgment and trust and turn it into a metric-driven product. They will have 100 therapists input data on “functional health outcomes—such as quality of life, personal relationships, employment, and daily function.” This set of data will then become a report card for therapy, and they may start tracking, managing, and paying therapists based on how they match these measurable outcomes.
As anyone who does testing knows, once you start testing, you start teaching to the test. With a health service like therapy, where the bedrock is trust, that is dangerous. If you combine this kind of new level of corporate control, a system of arbitrary metrics and payments tied to them, and then AI systems that nudge therapists around clinical outcomes, we have the exact situation Doctorow is discussing. Therapists will be paid not to establish relationships and help according to their clinical judgment, but according to “value-based” metrics established by these distant financiers and technocrats.
(In other areas where value-based care has been tried, it has been a disaster, a mechanism to take control from health professionals and move it to financiers. It is the basis for Medicare Advantage, and led to extreme waste, prior authorization denials, and intense micro-management by financiers over clinicians and patients.)
The truth is, all of these games are just a way for insurance companies to avoid having to pay out for mental health care. They claim they offer it, but their networks don’t include very many therapists, and they under-reimburse the ones they do accept.
A better approach is a law that Illinois just passed, which regulates mental health like a public utility. Insurers have to reimburse through a formula, and they can’t put up excessive red tape to block therapists from joining networks. Up to 2.5 million people with insurance will now get access to therapists as a result.
And this solution brings us back to technology. It is certainly the case that AI could help with mental health, if deployed properly and under the control and supervision of the profession itself. But it’s also true that if financiers are the ones managing the transition, the likeliest outcome will be a profession of reverse centaurs, and a very depressed and increasingly hopeless America.
That path is not inevitable. AI can be quite useful, and therapy is something we can make more widely available. Moreover, millions of people have gone through therapy with positive outcomes, and know it can work. People within the profession have integrity and do feel a strong need to protect their patients. As we’re seeing, Americans are angry about the harms big tech is inflicting on all of us, and will not look kindly on for-profit financiers seeking to abuse those looking for mental health care.
But as we think about how to regulate and deploy AI, it’s important to understand that this technology will be deployed consistent with the financial incentives and regulations in different industry segments. So we should, as Illinois has, get the rules right.
Thanks for reading! Your tips make this newsletter what it is, so please send me tips on weird monopolies, stories I’ve missed, or other thoughts. And if you liked this issue of BIG, you can sign up here for more issues, a newsletter on how to restore fair commerce, innovation, and democracy. Consider becoming a paying subscriber to support this work, or if you are a paying subscriber, giving a gift subscription to a friend, colleague, or family member. If you really liked it, read my book, Goliath: The 100-Year War Between Monopoly Power and Democracy.
cheers,
Matt Stoller
P.S. Here are a few more clips of the CEO of Blueprint: