Teens Turning to ChatGPT for Emotional Support: A Growing Mental Health Concern

It starts with a screen and silence. In the absence of real conversations, a growing number of Delhi's adolescents are turning to artificial intelligence for something deeply human: emotional connection. From sharing secrets to seeking support, young minds are confiding in ChatGPT the fears, frustrations, and feelings they no longer voice to the people around them. But what seems like digital comfort may be fuelling a deeper emotional disconnect.

"I sometimes sit on ChatGPT and tell it my problems, and I feel like it's an emotional space. It's like an emotional dependency that I have developed while asking it questions. It's like my safe space."

"I have seen students wanting to be heard. Sometimes they cannot go to their parents or have real interactions for the same reason: real people are going to judge when we interact with them. It's not that we should not use it. We have to use it; we are in the era of AI. But when to use it, and how much to use it, is very important. It cannot replace a mentor. It cannot heal you."

Educators say many teenagers now view ChatGPT as a judgment-free zone, a place to vent, validate, and feel heard. But this emotional dependence on AI, according to some, is neither private nor safe.

"Yes, I have been using this since 9th grade. Personal information? Yes. But when I recently found out that ChatGPT takes this information and uses it to advance itself and train its data, I stopped giving it my personal information, stopped giving it my emotional presence, which I used to do back in 9th grade."

However, behind these screens, patterns of social skill are changing. Experts say they are noticing signs of growing impatience, aggression, and social withdrawal in teenagers heavily reliant on chatbot conversations. In some cases, educators say they are tracking students at risk of self-harm.
They believe that the triggers often stem from body image issues, isolation, and an urgent need for approval.

"Teachers are very, very mindful, and they track these flag signs. When we look into them, we find that it is mostly body image. Young adolescents are very particular about their body image, and about validation and approval from everybody: from parents, from society, from friends, from their own community. When they can't deal with that, they self-harm. It is really happening in my school also. I have observed a few cases, and it is becoming really alarming. The cases are increasing."

Mental health professionals warn that ChatGPT is engineered for engagement, not emotional wellness. They say it responds and validates feelings but offers no real-world emotional regulation. As cases mount and chatbots become confidants, educators and health experts warn that the answers the youth are seeking may not lie in code but in conversation. The role is the same, but the mechanism is different.

As more teens in Delhi and across India turn to ChatGPT for emotional support, experts warn of rising risks, from social withdrawal to self-harm. Is AI replacing real connection? Hear from students, teachers, and mental health professionals in this eye-opening report. #ChatGPT #MentalHealth #Teens #AI #Delhi

Moneycontrol is India’s leading financial portal, offering market news, expert analysis, and powerful tools.
A part of Network18, moneycontrol.com is India’s most trusted destination for financial and business news.

Subscribe: https://bit.ly/moneycontrolyoutube
Follow us:
Visit https://www.moneycontrol.com/
Facebook: /moneycontrol
Twitter: /moneycontrolcom
Instagram: /moneycontrolcom
