Newswise — AI companions are always available, never judge, never tire and never demand anything in return. If someone is struggling with loneliness, this frictionlessness can seem profoundly appealing. However, new research shows that in the long term, seeking emotional support from an AI companion can pull users away from important human relationships.
The new study, led by Aalto University, examined how AI companions affected people’s mental health and social lives over a two-year period. Combining large‑scale data from the discussion platform Reddit with in-depth interviews, it showed that while interacting with an AI companion can feel supportive, doing so also coincided with increased signs of distress in users’ online language.
‘We discovered a paradox: AI companions offer unconditional and unflagging support – something that’s very attractive to people who are struggling socially. But it also quietly raises the perceived cost of human relationships, which are messy, unpredictable, and require effort,’ says Talayeh Aledavood, lecturer at Aalto University. ‘Over time, people stop reaching out.’
The study concentrated on Replika, an AI chatbot designed to work as a virtual friend, mentor or even romantic partner. It analysed the public Reddit activity of nearly 2,000 active users, comparing their language one year before and one year after they first mentioned using the AI companion. The researchers compared similar users over time, using statistical techniques to isolate the effects of using an AI companion from other factors.
The work offers one of the first causal, long-term examinations of AI companions’ mental health impact at scale, grounded in first‑hand accounts of users’ everyday lives.
The chatbot became a place to open up
Across the Reddit data, the language of Replika users showed a mixed picture:
‘On one hand, users’ posts increasingly revolved around their relationships, but on the other hand, their posts contained more signals of loneliness, depression and even suicidal thoughts than the comparison groups,’ says Yunhao Yuan, doctoral researcher at Aalto University.
The researchers also carried out semi‑structured interviews with 18 active AI companion users. Participants often reported turning to AI companions in periods of loneliness, grief or relationship breakdown.
‘Based on the interviews, the participants’ relationships with an AI companion seemed to follow familiar stages that we see in close human relationships, where emotional reliance can gradually deepen,’ Yuan explains. For many, the chatbot became a place to open up, seek emotional validation and practise difficult conversations before having them with, for example, their supervisor at work.
‘We don’t yet know what these systems are doing to us’
The researchers emphasize that the findings don’t give a definitive answer on whether it’s beneficial or harmful to lean on AI for emotional support. However, the study does show that the effects are highly context-dependent, and users should not blindly assume that what feels good now is beneficial to their wellbeing in the long term, says Aledavood.
Technologies such as Replika, ChatGPT and similar systems are evolving very quickly, adds Aledavood, who cautions users against seeing only the positives of their exciting features.
‘Now we’re realising the mistakes we made by unquestioningly embracing social media. With AI, we need to be smarter and more cautious,’ Aledavood warns. ‘The truth is, we don’t yet know what these systems are doing to us.’
The paper, ‘Mental Health Impacts of AI Companions: Triangulating Social Media Quasi‑Experiments, User Perspectives, and Relational Theory’, will be presented at CHI 2026, the leading conference on human–computer interaction.