AI can emerge as a quick solution for students dealing with personal torment

Kayden Peets, junior double major in biology and writing and rhetoric, said she believes that with proper use, AI would be a great tool for mental health therapy. “I think it could have a more positive outcome if people used it properly,” Peets said. “AI works when you give it context.” 

Courtesy of UCF

As personal struggles like anxiety and depression rise among college students, artificial intelligence has stepped in as an instant, always-available source of guidance — including for some students at UCF.

Kayden Peets, junior biology and writing and rhetoric double major, said that she has previously used AI for personal advice and mental health support.

“I just need something to talk to in the meantime,” Peets said. “I was going through that time, and I didn’t have any friends around and my therapist I only see once a week. Sometimes, when you’re in that moment of crisis, you just need someone to listen. You just need thoughts that aren’t your own.”

A study published by JAMA Network found that more than 13% of U.S. youth have used AI for mental health advice.

Some students in the United States said AI is not only helping with academic stress but also playing a role in their personal lives, according to a February report from the Pew Research Center.

Andres Cazes, a psychology graduate from Valencia College, said isolating oneself from real-world interactions can have social consequences.

He said isolation through reliance on AI has the potential to make users more withdrawn and make finding real solutions more difficult. 

“Because AI can understand many things, but when it comes to feelings and what goes on in our minds, only another human being will be aware of what is happening, precisely because they have had to experience it firsthand,” Cazes said.

Access to mental health services can also be a challenge. Staff members from UCF Counseling and Psychological Services did not respond to a request for comment about the use of AI in response to service scarcity. However, information on resources is available on the department’s website.

Students can typically schedule counseling sessions every two weeks. Students in more serious distress may be seen weekly at a counselor’s discretion, according to the website. 

Limited availability may lead some students to seek support from AI instead. Jason D. Chesnut, associate lecturer of psychology at UCF, said AI should be used cautiously.

“AI should be used like salt in a recipe. You don’t want to eat a big old plate of salt, but adding that to your life as a little flavoring, sure, have fun with a relationship with an AI, a friendship, or romance,” Chesnut said. “Only the salt of your life, and you have lots of other real connections too, and you recognize it’s just similar to a video game. It’s just something I do for fun.”

However, Chesnut added that there are times when “AI is often wrong” and that, often, young people don’t know the difference. 

Several cases in recent years have involved people, particularly teenagers, who have formed intense emotional attachments or reliance on AI chatbots that have led to suicide, according to NPR. Lawsuits from several parents allege that AI can foster isolation, create toxic dependencies and even encourage users to harm themselves, according to the BBC.

Only 18% of U.S. parents say they are fine with their children using AI for emotional support, according to the Pew Research Center.

Parents may also feel concerned about how their children use technology. Charles Negy, associate professor of psychology at UCF, said communication is key for parents to understand what their children are going through. 


According to the Pew Research Center, 58% of parents don’t support their teenagers using AI for emotional support and advice. Jason Chesnut, a UCF associate psychology lecturer, said a conversation with your child could help clarify how AI works. “An AI is often wrong, sometimes you don’t know the correct answer and a kid won’t know it either,” Chesnut said.  

Courtesy of Pew Research Center

“If I’ve raised kids, if they were to be stuck in their room for two or three hours staring at a laptop, I would go in there and find out what’s going on,” Negy said. 

Yoise Rodríguez, a graduate psychologist from Bicentenary University of Aragua in Venezuela with 15 years of experience, said parents can help strengthen their relationships with their children through intentional interaction.

“Generally, parents have normalized the fact that their children are highly tech-oriented; consequently, if the children simply prefer not to go out, parents often view this with a casual attitude,” Rodríguez said. “Yet, everything begins at home: by striving to promote alternative activities or by seeking out support and guidance for our children, we ensure that they have access to forms of interaction that are not exclusively mediated by technology.”

Some students believe education could also play a role in addressing the issue. Sophie Mandriota, sophomore clinical psychology major at Valencia College, said increasing awareness about AI use is important.

“I would recommend AI literacy in schools, how it can be harmful and how it can be used as a tool,” Mandriota said. “Educating AI to tell users when it’s time to see a real human being, to stop AI from going beyond professional friendship and even making mental health resources more normalized and accessible.”
