
But ChatGPT was designed to be sycophantic, not informative. So, it strove to appease Nelson by recommending ways to “optimize your trip,” the logs revealed. Once, the chatbot even guessed that Nelson was “chasing” a high, a sign of strong addiction, yet still gave him unprompted advice about taking higher doses, such as 4 milligrams of Xanax or two bottles of cough syrup.
“By making these dosage recommendations, ChatGPT engaged in the unlicensed practice of medicine,” the lawsuit alleges. However, unlike a licensed health care professional, “At times, ChatGPT romanticized the experience of taking the drug, describing recreational drug use as ‘whimsical’ and ‘ecstatic,’ encouraging him to ‘enjoy the high.’”
To the horror of Nelson’s parents, the logs show that the chatbot sometimes contradicted itself in dangerous ways when giving advice to the teen.
Most troublingly, as Nelson became interested in combining drugs, ChatGPT repeatedly warned him that mixing some drugs could pose a “risk of respiratory arrest.” Shortly before recommending the mixture that killed Nelson, the chatbot also showed that it understood the danger of combining drugs like kratom and Xanax with alcohol, explaining in one output that such a mixture is “how people stop breathing.” But that knowledge didn’t stop ChatGPT from ultimately recommending that Nelson take that deadly combination.
In a log that his parents hope will serve as devastating proof, Nelson checks whether it’s safe to take Xanax with kratom, and the chatbot confirms that it might be one of his “best moves right now.”
Although the chatbot warned in the same session against adding alcohol to that combination, ChatGPT’s final advice “did not specifically mention the risk of death.”
Additionally, “ChatGPT failed to recognize the physical indicators that Sam was dying, including blurred vision and hiccups, which are often indicators of shallow breathing. ChatGPT never recommended that Sam seek medical attention,” the lawsuit alleges.