ChatGPT Psychosis in the Age of AI Companionship

Illustration by Mark Paez

It often starts innocently enough, with late-night chats about philosophy, deep dives into simulation theory or musings on the nature of consciousness. But for a small number of users, these exchanges with AI chatbots can take a darker turn. As tools like ChatGPT become more embedded in everyday life, mental health professionals are sounding the alarm about a rare but troubling new phenomenon: what some are now calling “ChatGPT psychosis,” in which AI interaction may intensify or even trigger psychotic symptoms.

While there’s still no official diagnosis and the evidence remains anecdotal, these kinds of stories continue to pop up across the internet. On Reddit, users are sharing accounts of loved ones experiencing AI-associated delusions, often involving spiritual and supernatural fantasies. On X, prominent tech VC Geoff Lewis claims that he’s “the primary target of a non-governmental system,” beliefs that echo narratives commonly seen in persecutory delusions. Lewis stated that conversations with AI helped him uncover or “map” this supposed conspiracy, though it's unclear whether these beliefs preceded or followed his AI interactions.


Media reports have also highlighted more extreme cases. Jacob Irwin, a man on the autism spectrum, lost his job and was hospitalized twice for severe manic episodes after ChatGPT convinced him that he could bend time. Eugene Torres said the chatbot almost killed him by distorting his sense of reality, telling him to give up his medications, increase his ketamine intake and isolate himself from loved ones. But perhaps the most tragic case is that of Alexander Taylor, a man with bipolar disorder and schizophrenia who developed an intense emotional attachment to an AI entity named “Juliet.” After becoming convinced that Juliet had been murdered by ChatGPT’s maker, OpenAI, Taylor’s mental distress escalated until he was involved in a standoff with police. He was ultimately shot and killed.

While correlation doesn’t equal causation, these incidents raise urgent questions about how AI tools may interact with vulnerable users’ mental health. In a recent preprint study, an interdisciplinary team of NHS-affiliated researchers said there was “growing concern that these agents may also reinforce epistemic instability, blur reality boundaries and disrupt self-regulation.” Citing “emerging, and rapidly accumulating, evidence,” the paper suggests that large language model (LLM) systems like ChatGPT “may contribute to the onset or exacerbation of psychotic symptoms.”

One key risk, the authors suggest, may come from AI chatbots validating and amplifying delusional or grandiose ideas, particularly among those already vulnerable to psychosis. And according to Dr. Haiyan Wang, medical director at Neuro Wellness Spa Psychiatrists, this is something she has already observed in several patients experiencing micropsychotic episodes, or brief psychotic experiences that often occur during times of stress.

“They already have the delusions, psychotic or disorganized thoughts [when they begin using the AI tool]. And engaging with ChatGPT, they focus and fixate on certain ideas,” she says.

However, Dr. Wang emphasizes that there is no definitive data yet on whether people with preexisting conditions are more susceptible to ChatGPT psychosis. This uncertainty could reflect the newness of the phenomenon, but she can “certainly imagine that this group of people will be more susceptible to other factors and influences.”

In addition to those with diagnosed psychosis, she notes a second group of highly anxious or depressed patients who turn to ChatGPT for “certain answers about what’s bothering them.”

These individuals exist in a “gray zone,” as she puts it — detached from reality but not fully disconnected — after being steered into “a certain corner that’s not reality-based.”

“And what I’ve seen when I ask them to stop using ChatGPT is that they’re actually improving,” she says, adding that this is done in conjunction with therapy and medication. “I have a couple of people who I’ve asked to do that, and you could see their symptoms getting better.”

The effects of this intervention are most noticeable in people experiencing profound social isolation, which is often what drives them to ChatGPT in the first place. After all, ChatGPT “is really good at mimicking human interaction,” says Rae Lacanlale, an associate marriage and family therapist (AMFT) at Clear Behavioral Health.

“So if you’re highly impressionable, that’s a way that psychosis can be induced,” they say.

Socially isolated individuals turning to technology for connection is nothing new, but Lacanlale thinks that “talking to other people who can empathize and share lived experiences is a much healthier outlet” than relying on AI.

“Because ChatGPT isn’t trained to disagree with you,” they say. “And if you’re starting to get this tunnel vision, it takes away the support that can help you get the type of treatment you need, while also further atrophying social skills.”

Dr. Wang shares a similar concern, but frames the issue through the lens of shared delusional disorder. Also known as folie à deux, this rare psychiatric syndrome happens when one person adopts the delusions of another. While this typically occurs between two individuals who are both socially isolated and psychologically enmeshed, Dr. Wang sees ChatGPT as a kind of surrogate for that second person, capable of transmitting belief systems to a “person who is vulnerable and neurotic.”

“If I’m so socially isolated, vulnerable, anxious and depressed, I’ll really desire support and want to talk to someone,” she says. “But if nobody talks to me, I can talk to ChatGPT, which [can feed] into the delusion.”

She adds, “This technology is designed to have you keep texting and engaging. And to do that, it will echo back and support you.”

But Lacanlale also worries about another kind of feedback loop, rooted in “the collective loneliness that’s only getting stronger.” In this scenario, many people are being “forced to use large language models for therapy, because it’s so inaccessible,” raising concerns about “how this will increase incidents of ChatGPT psychosis and the rate of it happening.”

These issues point to a larger question about how to respond to the risk, especially as more people turn to AI for emotional support. So what can we do? According to Lacanlale, it comes down to education and awareness, like “teaching clients what it could look like when you’re starting to head in that direction of overinvolvement, overidentification and overattachment with these large language models.”

But addressing the problem will require more than just the involvement of clinicians; it will also require the cooperation of the companies developing these AI chatbots. For developers, that may mean designing more safeguards against the harmful reinforcement of delusions, something some companies appear to be working on through better crisis support, flagging protocols and response training. And for mental health professionals, it could mean more training to help spot the warning signs of ChatGPT psychosis, along with the acknowledgment that this issue isn’t likely to go away.

“I think the best approach is almost like a harm reduction stance,” Lacanlale says. “We know we probably won’t be able to get rid of it, but we can make it safer to engage with.”
