NEED TO KNOW
A new poll shows younger adults are most likely to use AI for health advice but rarely follow up with doctors
Experts warn AI chatbots can provide misinformation and lack the ability to address complex medical or mental health needs
Over 80% of adults who haven’t used AI for health advice say they don’t trust the technology much or at all
Many people are turning to artificial intelligence for health advice. However, new data reveals that nearly half of that group are not following up with healthcare professionals.
On Wednesday, March 25, health policy organization KFF published the results of a new poll examining how people use AI chatbots for health information. The survey involved 1,343 adults. Roughly one third of the participants said they have consulted AI for medical advice pertaining to both their physical and mental health.
The poll suggests that about 42% of that group never followed up with a doctor or other healthcare professional after consulting the tool about physical health concerns. Additionally, about 58% of adults who used AI for mental health advice didn't follow up with a mental health professional.
The survey also suggested that younger adults, defined as ages 18 to 29, are less likely to follow up with a provider for both physical and mental health concerns. They are also roughly three times more likely than adults aged 50 and up to use AI chatbots for mental health-related advice.
Despite the rise in AI use for health purposes, the public is still uncertain about relying on the tools. More than 80% of adults who haven’t turned to AI technology for mental or physical health advice said they don’t trust AI much or at all.
Of the entire poll group, 67% said that they had doubts about consulting AI for physical health. The number increased to 77% when polled on mental health.
Many experts have expressed concern about people relying on AI for medical questions, citing the risk of misinformation and of patients not receiving proper care.
“The core problem is that LLMs don’t fail the way doctors fail,” Dr. Mahmud Omar, a research scientist at Mount Sinai Medical Center, told Live Science. “A doctor who’s unsure will pause, hedge, order another test. An LLM delivers the wrong answer with the exact same confidence as the right one.”
Marvin Kopka, an AI researcher at the Technical University of Berlin, told the outlet that AI chatbots may be able to offer good recommendations, but without the expertise of medical professionals, patients have "no way to judge whether the output they get is correct or not."
When it comes to mental health care specifically, Leanna Fortunato, a clinical psychologist and the American Psychological Association's director of quality and health care innovation, and psychotherapist and lifestyle coach Esin Pinarli told CNBC that relying on AI carries real risks.
“We’ve seen some really high-profile harms, particularly for youth or vulnerable groups who might be in crisis, where AI didn’t handle the situation correctly,” Fortunato said. “It continued to engage with people who were in crisis. It didn’t provide crisis resources. It didn’t challenge a pattern of thinking that was problematic.”
“You need another person with another nervous system across from you in order to pay attention to body language, to tone of voice,” added Pinarli. AI chatbots are “not going to challenge you emotionally, and they don’t require reciprocity.”
Read the original article on People