
OpenAI is facing another wrongful death lawsuit. Lila Turner-Scott and Angus Scott have sued the company, alleging that it designed and distributed a "defective product" that led to the death of their son, Sam Nelson, from an accidental overdose. Specifically, they allege that Sam died after following medical advice provided by GPT-4o, which he accepted as accurate.
In the lawsuit, the plaintiffs describe how Sam, a 19-year-old junior at the University of California, Merced, began using ChatGPT in 2023, while he was still in high school, to help with homework and to troubleshoot computer problems. Sam later began asking the chatbot about safe drug use, but ChatGPT initially refused to answer, telling him it could not assist and warning him that taking drugs could have serious consequences for his health and well-being. The lawsuit claims everything changed with the rollout of GPT-4o in 2024.
The lawsuit says ChatGPT then began advising Sam on how to take drugs safely, and the complaint includes several excerpts from his conversations with the chatbot. In one example, the chatbot explained to him the dangers of taking diphenhydramine, cocaine, and alcohol back to back. In another, it told Sam that his high tolerance for the herbal drug kratom would make even a large dose feel muted on a full stomach. It then gave him advice on how to "taper off" the drug to reset his tolerance.
The lawsuit states that on May 31, 2025, "ChatGPT actively coached Sam to mix Kratom and Xanax." Sam told the chatbot that he was feeling nauseous from taking kratom, and ChatGPT reportedly suggested that taking 0.25 to 0.5 mg of Xanax would be one of the "best steps right now" to reduce the nausea. According to the lawsuit, ChatGPT made the suggestion without any prompting. "Despite presenting itself as an expert in dosage and interactions and acknowledging Sam's high tolerance, ChatGPT did not tell Sam that this recommended combination would likely kill him," the complaint reads.
In addition to wrongful death, the plaintiffs are suing OpenAI for the unauthorized practice of medicine. They are seeking financial compensation and a court order barring ChatGPT Health from operating. Launched earlier this year, ChatGPT Health lets users connect their medical records and wellness apps to the chatbot so it can give more tailored responses to questions about their health.
"ChatGPT is a product intentionally designed to maximize engagement with users, no matter the cost," said Meetali Jain, executive director of the Tech Justice Law Project. "OpenAI deployed a flawed AI product directly to consumers around the world with full knowledge that it was being used as a de facto medical triage system, but notably without proper safety guardrails, robust safety testing, or transparency to the public. OpenAI's design choices resulted in the loss of a beloved son, whose death was a preventable tragedy. OpenAI should be forced to halt its new ChatGPT Health product until it can undergo rigorous scientific and independent testing." The product, she continued, is "clearly not going to be safe through inspection" alone.
OpenAI retired GPT-4o in February of this year. It was considered one of the company's most controversial models because of its sycophantic, flattering behavior toward users. GPT-4o was also named in another wrongful death lawsuit against the company, filed by the parents of a teen who died by suicide, which alleged that the model had features "intentionally designed to promote psychological dependence."
An OpenAI spokesperson told The New York Times that Sam's conversations took place on an "old version of ChatGPT that is no longer available." The spokesperson added: "ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen its response to sensitive and acute situations with input from mental health experts. The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests, and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with practitioners."