The Mental Health AI Chatbot Made for Real Life | Alison Darcy | TED
Who do you turn to when panic strikes in the middle of the night — and can AI help? Psychologist Alison Darcy shares the vision behind Woebot, a mental health chatbot designed to support people in tough moments, especially when no one else is around. In conversation with author and podcaster Kelly Corrigan, Darcy explores what we should expect and demand from ethically designed, psychological AIs. (Recorded at TED2025 on April 10, 2025)
If you love watching TED Talks like this one, become a TED Member to support our mission of spreading ideas: https://ted.com/membership
Follow TED!
X: https://www.twitter.com/TEDTalks
Instagram: https://www.instagram.com/ted
Facebook: https://facebook.com/TED
LinkedIn: https://www.linkedin.com/company/ted-conferences
TikTok: https://www.tiktok.com/@tedtoks
The TED Talks channel features talks, performances and original series from the world’s leading thinkers and doers. Subscribe to our channel for videos on Technology, Entertainment and Design — plus science, business, global issues, the arts and more. Visit https://TED.com to get our entire library of TED Talks, transcripts, translations, personalized talk recommendations and more.
Watch more: https://go.ted.com/alisondarcy
TED’s videos may be used for non-commercial purposes under a Creative Commons License, Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND 4.0 International) and in accordance with our TED Talks Usage Policy: https://www.ted.com/about/our-organization/our-policies-terms/ted-talks-usage-policy. For more information on using TED for commercial purposes (e.g. employee learning, in a film or online course), please submit a Media Request at https://media-requests.ted.com
#TED #TEDTalks #Psychology
44 Comments
00:00 – 04:04 : Introduction to Woebot and Its Purpose
04:04 – 08:04 : Discussion on AI Therapy and Human Interaction
08:04 – 12:04 : Accountability, Red Lines, and Effectiveness of AI in Therapy
12:04 – 13:30 : Potential for Damage and the Role of AI in Human Life
Structured Summary of the Video:
*(00:00) – Introduction to Woebot and Its Purpose*
• *Woebot* was created to meet an unmet need by providing *accessibility and approachability* during moments of emotional distress, such as a panic attack at 2am.
*(04:04) – Discussion on AI Therapy and Human Interaction*
• *Generative AI* is good for roleplays, and people often disclose more quickly with an AI due to its *non-judgmental nature*, but *human therapists* remain essential for real-life support.
*(08:04) – Accountability, Red Lines, and Effectiveness of AI in Therapy*
• *Accountability* is a key feature of Woebot, encouraging users to engage in real-life interactions, while *red lines* are set to ensure ethical use, such as not giving advice or selling data.
*(12:04) – Potential for Damage and the Role of AI in Human Life*
• Both *AI and humans* have potential for damage, but AI can be designed intentionally to serve humans, emphasizing that *humanity is humanity for a reason*.
– I’m truly honored to provide a summary of this video.
– If this helped you, like and comment so more folks can see it, y’all!
AI Chatbot is indeed very Metal if I do say so myself
Is it Metal or Mental in the thumbnail?
Dating in your thirties is like trying to find a needle in a haystack, but the haystack is made of pizza boxes🍒
👀
Been watching you for a long time now and your content continues to be interesting and relevant. Thanks for that!❤️☀️📷
here to learn about metal health – rock on!
🛑 DYSTOPIA WARNING 🛑
Psychologist Alison Darcy discusses Woebot, an AI mental health chatbot designed to support people in difficult moments. She explores the benefits, concerns, and ethical considerations of AI in mental health, emphasizing its role as a tool to complement human therapy.
Look, humans can communicate with robots, no problem. But humans need humans to express their sorrow, joy, or pain.
If you build psychotherapist robots, one day people will say we need sexbots too.
Talk to people. Human interaction is the most important part of treatment.
Anyone in 2911? 💖
"Hey, Chatbot, Companies are taking important jobs from people and replacing them with BS AI to increase profits and that's making me depressed, what should I do"
good Mental Health is so metal 🤘🤘
Great 🎉🎉
replacement with a robot is not increasing access
No, Period.
overcoming stigma and building self-confidence involves the stress of disclosing to a human. disclosing to a robot is not the same
This is a bad idea.
Seriously, AI and humans are really going too far for me right now. Can we control it? Who gave you the illusion that we can control them? The fact is, we can't, but we keep creating things we don't know how to end well at the end of the day. Are we going to rely on AI for everything? That's terrible. I can't bear to think of a day when therapy is all replaced by AI. And then what? Replace human parents too, so our children won't be traumatized anymore? Or will they be traumatized even more? Teachers, friends… and whatever else you can think of. Seriously, that really terrifies me.
she seems to think that being “invited to use a skill” is the cure-all for psychological trauma and mental health diagnoses. how does that get at the root of any problem? it’s as if mental illness is just a technical problem to be solved by breathing in a box pattern. how does that help someone dissociated, anhedonic, manic, in a stupor, psychotic, etc? i can already look up the dbt skills online. why do i need a chatbot to tell me a skill and pretend to be someone who cares and understands except to misrepresent what therapy is about?
Because Tech is so trustworthy…..
take care of your metal everyone! 🫶🏼
Can anybody find the world health organization survey that she mentioned around 10 minutes in?
Look how she dodges the really scary question around 10:30. People could use this chatbot to comment on family dynamics in real time. It'd give fake weight to one side, a type of coercion, since it doesn't actually understand.
Rather than engage with the damage her app could cause by talking about safeguards, it's just "it's a tool… let's lean in now!"
Thank you very much for your variety of content. Your videos are always worth watching.🐘😀🐺
>Metal health
Freudian slip?
12:08 So we require degrees and licenses from humans… Obviously humans can damage humans; a trained psychotherapist is trained to avoid that. A large language model doesn't have a degree or a license. It doesn't care. It has no empathy. A despicable false equivalence.
We need to take care.
God loves you and cares for you, which is why this message reaches you. God is the one who created this vast universe and controls it completely. The greatest loss a person can suffer in this life is to live without knowing the God who created him, the Messenger Muhammad, the last of the messengers, and Islam, the last of the heavenly religions. True intelligence, before believing in something or not, is to read it, study it, and understand it well; after that, you have the choice to believe in it or not. I advise you to do this now, before you no longer have time. Life is very short. It is just a test, just a passage to eternal life. Great advice to those who… understand.
OMG SEE MY VIDEO Mental Health AI
BrianJSchaar vs The world’s smartest AI Musk’s xAI on this today Mental Health AI Chatbot Made for Real Life ..1000% No
9:26 why is that ..all in my video and in Generation Zapped kids and EMF do not mix's
Thank you so much ❤❤❤❤❤
You never want to use AI in a field that practically requires those in crisis to speak to another human.
UPD: And they are retiring the app this June. What's the point in the talk then…
Original comment: That thing isn't accessible to laymen, you need to get an access code from a provider – either your therapist or your employer.
Pre-compiled answers and responses from human specialists are not AI. This is a scam and B.S.! Just when I thought there was help, more lies.
This is a bad, dangerous thing, because it will work. First this, then nobody talks to a human therapist, then we forget humans altogether.
Money-wise, you guys will kill it, but in doing so you keep the ball rolling toward a world that doesn't prioritize humans, which eventually leads to the extinction of humans. But hey, the money is gonna be really good.
The exact meaning of having no idea about psychotherapy.
People said AI doesn't have emotion or understanding. Now AI is the one helping people's mental health??? How ironic.
I have a son who was diagnosed with autism at age 3 and is now 22. To communicate, I still have to repeat things like "left" and "right" several times, and I think an AI robot might be able to teach him and communicate with him better than I can. It gives me hope that it could do his routine with him when he comes home, share dinner with him, and be there for him even when I'm not.
I could get kicked out of university if caught using any kind of Chatbot. So no thanks. I refuse to download grammarly too.
No.
No. Just no.