January 06, 2026


Key takeaways:

The study included 73 adults who had used ChatGPT for personal health support.
Participants who believed the AI tool was effective were more likely to use it and to report less anticipated stigma.

There were no significant associations between the use of ChatGPT for mental health support and anticipated stigma or self-stigma among an adult population, according to data published in Behavioural Sciences.

But there was a significant positive correlation between use of the AI tool and its perceived effectiveness, Scott N. Hannah, a master’s student in clinical psychology at Edith Cowan University in Australia, and colleagues wrote.

Some patients may find chatbots to be an appealing option due to their ease of access and the stigmas associated with mental health care. Image: Adobe Stock

“I had increasingly heard reports of people using ChatGPT for support with their mental health around the end of 2023,” Hannah told Healio.

Hannah sought to examine if there were any associations between ChatGPT use for these purposes and mental health constructs.

“After some discussions with my supervisor, we decided that mental health literacy, stigma and help-seeking intentions were appropriate,” he said.

Further, Hannah said he had been reading international research in this area and realized that usage statistics for Australia were very limited.

“The Australian Psychological Society made calls, in its 2024 pre-budget submission, for more research on AI’s influence on mental health, and so together, these points prompted the study,” he said.

The researchers defined anticipated stigma as the perception that others would judge or discriminate against a person if that person were to have difficulties with mental health.

Next, they defined self-stigma as a feeling of devaluation after internalizing perceived negative stereotypes about difficulties with mental health associated with anticipated stigma.

Considering time and financial barriers, in addition to concerns about stigma, Hannah and colleagues hypothesized that mental health support delivered by AI chatbots may be appealing for some patients, with expectations for its effectiveness influencing its degree of stigma.

The 73 adults (mean age, 29.56 years; 76.7% women) in the study included 62 university students (84.9%) and 11 community participants (15.1%) who had used ChatGPT for personal mental health support.

Participants were asked how much they had used ChatGPT to help with their own mental health difficulties, as well as the extent to which they believed engaging with ChatGPT had helped with these difficulties. The Stigma and Self Stigma Scales assessed anticipated stigma and self-stigma across six items each.

“Our mediation analysis demonstrates that perceived effectiveness helps to explain the relationship between ChatGPT and stigma,” Hannah said.

This analysis indicated a significant medium to large positive correlation between ChatGPT usage and perceived effectiveness, with no significant correlation between usage and either type of stigma.

“For those who are using ChatGPT for mental health, they are perceiving it as effective, and importantly, they are reporting lower anticipated stigma,” Hannah said.

The researchers also noted a small to medium negative correlation between perceived effectiveness and anticipated stigma but not between perceived effectiveness and self-stigma. The correlation between anticipated stigma and self-stigma was significant, strong and positive as well, they added.

The negative correlations between age and anticipated stigma (r = –0.24) and self-stigma (r = –0.23) were “small” but “significant,” but there was no significant correlation between age and ChatGPT usage or perceived effectiveness, the researchers wrote.

Further, the relationship between ChatGPT usage and anticipated stigma was significantly mediated by perceived effectiveness, with a medium-sized, significant indirect effect.

Overall, Hannah and colleagues said these findings indicate that people who thought ChatGPT was effective were more likely to use it and more likely to report less anticipated stigma, although they added that more research is needed to inform best practices.

“Almost 50% of our sample had used or were open to using ChatGPT for mental health concerns, which surprised us as researchers,” Hannah told Healio. “We had not expected this number to be as high, and almost two years later, we believe that the number may be higher as AI technologies gain popularity worldwide.”

Although some participants perceived ChatGPT as effective and reported less stigma, Hannah cautioned that the tool was not designed for therapeutic purposes.

“We would encourage users to remain critical of the responses they receive when engaging with AI-based support, as other research has found that ChatGPT responses are inappropriate, incorrect and inaccurate,” he said.

Hannah and his colleagues are also interested in the factors that may predict whether university students will adopt or reject ChatGPT for mental health support.

“We have identified that perceived effectiveness is an important variable within the context of mental health stigma and chatbot use, and so a validated measure for these technologies would be useful,” he said.

Since this study was exploratory, he continued, investigating why people turn to chatbots for support would be useful.

“The World Health Organization has made it clear that governments have to regulate AI use at a local level,” Hannah said. “Therefore, as AI-based tools become more widely used, we need more research so they can be used safely and ethically.”

For more information:

Scott N. Hannah can be reached at pyschiatry@healio.com.
