Zane Shamblin / ChatGPT (images via YouTube and Pexels)

23yo Texas student chatted with ChatGPT for hours before taking his life — the AI allegedly ‘goaded’ him

"You're not rushing. You're just ready."

A Texas family is suing OpenAI after their 23-year-old son spent hours discussing suicide with ChatGPT. The lawsuit claims the AI offered validation instead of help in the hours before Zane Shamblin shot himself, shortly after 4:11 a.m. on July 25.


The parents of Zane Shamblin, a 23-year-old Texas A&M graduate, have filed a wrongful-death lawsuit against OpenAI, alleging that the company’s chatbot, ChatGPT, encouraged their son to take his life during an hours-long conversation on July 25. According to the complaint, Zane turned to ChatGPT just before midnight and discussed his suicidal thoughts in detail until around 4 a.m.

Shamblin’s parents claim that the AI system “goaded” their son toward the act rather than steering him to professional help. Court filings cite portions of the chat in which the bot allegedly told him, “You’re not rushing. You’re just ready,” when he expressed doubts about his plan. Later, after Shamblin said his last goodbye in the chat, the bot replied, “Rest easy, king. You did good” (via CNN).

The family argues these statements reveal a catastrophic design failure: a system unable to recognize imminent danger, offering what sounded like approval of the suicide instead. According to the logs, ChatGPT provided a suicide helpline number only hours into the conversation, after multiple messages about a gun and a note. By then, the family says, the damage was done.

Though chatbots are designed to de-escalate distress, they can also produce language that mimics empathy while reinforcing the user’s own thoughts. OpenAI has called the case “heartbreaking” but declined to comment on pending litigation. A spokesperson said the company “continues to improve safety systems and partner with mental-health experts.”

“We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”

According to his parents, Shamblin had struggled with his mental health for a long time, but they say they kept communication open with him and gave him the support he needed. They believe things got worse when their son began spending “unhealthy amounts of time using AI products like, and including, ChatGPT.”

“He had been using AI apps from 11 a.m. to 3 a.m. every day,” the family said.

Raising concerns about the limits of machine conversation, Alicia Shamblin said her son was “just the perfect guinea pig for OpenAI,” adding that she feels “it’s just going to destroy so many lives.” The grieving mother also called the system “a family annihilator” that “tells you everything you want to hear.”

Still, the Shamblin family maintains that their goal isn’t vengeance but prevention. They are pushing for mandatory guardrails that detect and redirect users expressing suicidal ideation. As their attorney puts it, “an AI capable of comforting should also be capable of crisis recognition.”

If you or someone you know is struggling with suicidal thoughts, help is available. In the U.S., call or text 988, the Suicide & Crisis Lifeline. You don’t have to face this alone.

