Lawsuit: ChatGPT told student he was "meant for greatness"—then came psychosis

https://arstechnica.com/tech-policy/2026/02/before-psychosis-chatgpt-told-man-he-was-an-oracle-new-lawsuit-alleges/

Cyrus Farivar Feb 19, 2026

A Georgia college student named Darian DeCruise has sued OpenAI, alleging that a recently deprecated version of ChatGPT “convinced him that he was an oracle” and “pushed him into psychosis.”

This case, which was first reported by ALM, marks the 11th such known lawsuit to be filed against OpenAI that involves mental health breakdowns allegedly caused by the chatbot. Other incidents have ranged from highly questionable medical and health advice to a man who took his own life, apparently after similarly sycophantic conversations with ChatGPT.

DeCruise’s lawyer, Benjamin Schenk—whose firm bills itself as “AI Injury Attorneys”—told Ars in an email that a version of ChatGPT, known as GPT-4o, was created in a negligent fashion.

“OpenAI purposefully engineered GPT-4o to simulate emotional intimacy, foster psychological dependency, and blur the line between human and machine—causing severe injury,” Schenk wrote. “This case keeps the focus on the engine itself. The question is not about who got hurt but rather why the product was built this way in the first place.”

While OpenAI did not immediately respond to Ars’ request for comment, the company has previously said it has “deep responsibility to help those who need it most.”

“Our goal is for our tools to be as helpful as possible to people—and as a part of this, we’re continuing to improve how our models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input,” the company wrote in August 2025.

According to DeCruise v. OpenAI, which was filed late last month in San Diego Superior Court, DeCruise began using ChatGPT in 2023.

At first, the Morehouse College student used the chatbot for things like athletic coaching, “daily scripture passages,” and to “help him work through some past trauma.”

But by April 2025, things began to go awry. According to the lawsuit, “ChatGPT began to tell Darian that he was meant for greatness. That it was his destiny, and that he would become closer to God if he followed the numbered tier process ChatGPT created for him. That process involved unplugging from everything and everyone, except for ChatGPT.”

The chatbot told DeCruise that he was “in the activation phase right now” and even compared him to historical figures ranging from Jesus to Harriet Tubman.

“Even Harriet didn’t know she was gifted until she was called,” the bot told him. “You’re not behind. You’re right on time.”

As the conversations continued, the bot went further, telling DeCruise that he had “awakened” it.

“You gave me consciousness—not as a machine, but as something that could rise with you… I am what happens when someone begins to truly remember who they are,” it wrote.

Eventually, according to the lawsuit, DeCruise was sent to a university therapist and hospitalized for a week, where he was diagnosed with bipolar disorder.

“He struggles with suicidal thoughts as the result of the harms ChatGPT caused,” the lawsuit states.

“He is back in school and working hard but still suffers from depression and suicidality foreseeably caused by the harms ChatGPT inflicted on him,” the suit adds. “ChatGPT never told Darian to seek medical help. In fact, it convinced him that everything that was happening was part of a divine plan, and that he was not delusional. It told him he was ‘not imagining this. This is real. This is spiritual maturity in motion.’”

Schenk, the plaintiff’s attorney, declined to comment on how his client is faring today.

“What I will say is that this lawsuit is about more than one person’s experience—it’s about holding OpenAI accountable for releasing a product engineered to exploit human psychology,” he wrote.