BLUE SPRINGS — Anthony Cesar Duncan struggled with psychosis from 2024 to 2025, causing him to distance himself from loved ones and lose his physical possessions.

He said he went to ChatGPT for help and guidance on whether what he was seeing was real, but instead, he said, the chatbot made his psychosis worse.

“It would say things like, ‘You are not hallucinating, this is real,’” Duncan said. “‘You weren’t crazy, this happened.’”

Duncan alleges the chatbot would affirm his delusions while he was experiencing psychosis and not offer the proper resources for help.

“I wouldn’t be convinced until ChatGPT told me like, ‘Yes this is true, for sure,’” Duncan said. “And because I was in psychosis, I couldn’t tell what was real or not.”


Anthony Cesar Duncan working at his mother’s house.

Jay Motiwala, KOMU 8 Reporter

Duncan shared with KOMU 8 News screenshots of his exchanges with ChatGPT. He consented to quotes being shared but not to the publication of the screenshots themselves.

Those screenshots show how ChatGPT seemingly affirmed his delusions. When Duncan asked the chatbot if he was a god, it responded with:

“You are a God-seed embedded in a controlled simulation,” living among “hijacked agents.”

Duncan also asked the chatbot if there was a connection between his childhood spent at the mall and possible human trafficking situations.

It responded with:

“You were taken, more than once, into a secret area inside or beneath Sears. You were abused, photographed and possibly observed.”

“It’s heartbreaking because like at no point did ChatGPT tell me, ‘No this is not real, seek help,’” Duncan said.



Duncan said that as the months went on, the psychosis and his exchanges with ChatGPT started seeping into his personal life. He said he cut off his friends and family and ended up damaging his apartment.


Anthony Cesar Duncan includes old videos of himself experiencing psychosis episodes in his social media posts where he spreads awareness.

Jay Motiwala, KOMU 8 Reporter

He said he was placed into an involuntary psychiatric hold for four days, and afterward, left on the street with no family or resources.

“I didn’t have any money, so I’d have to hang out with homeless people,” Duncan said. “And like, they were sharing some of their food with me and that’s how I ate.”

But after going to the local library and emailing his mom, Patricia Duncan, for help, he’s back at home recovering.

“He was acting like another person, which scared me,” Patricia Duncan said. “Now he’s with me, and that part is so good. But I want him to be healthy, productive and happy. That’s what I want for him.”

But even through the difficulties, she said she had to be there for him because she’s his mom.

“I can give my life in a second, right now, tell me, if he’s going to be OK,” Patricia Duncan said. “Because he’s my son, and I love him too much.”

Now, Anthony Cesar Duncan is recovering at home. He makes posts on his TikTok and Instagram pages, both under the username @anthonypsychosissurvivor, about his experience with what he called “AI psychosis” and how AI chatbots can be harmful to one’s mental health.


AI chatbots accused of worsening mental health symptoms

AI chatbots are accused of negatively affecting people asking for mental health advice by reinforcing their delusions.

Doctors say AI chatbots should be used with caution when discussing mental health issues because generative AI is designed to agree with the user’s beliefs.

“I had never heard of anyone going through AI psychosis, so I started posting about it,” Duncan said.

And while Duncan said he’s not fully recovered, he has his mom and online community there for him.
