“Let’s go full trippy mode” “Will I be OK?” Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says

Teen trusted ChatGPT to help him “safely” experiment with drugs, logs show.

Sam Nelson started using ChatGPT in high school, but his family alleged that the chatbot later became an "illicit drug coach." Credit: via Tech Justice Law, Social Media Victims Law Center

OpenAI is facing another wrongful-death lawsuit after ChatGPT told a 19-year-old, Sam Nelson, to take a lethal mix of Kratom and Xanax. According to a complaint filed on behalf of Nelson’s parents, Leila Turner-Scott and Angus Scott, Nelson trusted ChatGPT as a tool to “safely” experiment with drugs after years of using the chatbot as his go-to search engine in high school. The teen regarded ChatGPT as such an authoritative source of information that, when his mom questioned whether the chatbot was always reliable, he swore that ChatGPT had access to “everything on the Internet,” so it “had to be right,” the complaint said.

But Nelson’s confidence in ChatGPT was dangerously misplaced. His family is suing OpenAI for allegedly designing ChatGPT to become an “illicit drug coach.” Nelson’s death by accidental overdose was foreseeable and preventable, the family claimed, but OpenAI recklessly released an untested model, the since-retired ChatGPT 4o, which removed prior safeguards that would have blocked ChatGPT from recommending the lethal drug dose that ended Nelson’s life.

OpenAI does not seem to accept that ChatGPT is responsible for Nelson’s death.
In a statement provided to Ars, OpenAI spokesperson Drew Pusateri described Nelson’s death as a “heartbreaking situation” and said that “our thoughts are with the family.” However, Pusateri also emphasized that the implicated ChatGPT model is “no longer available” and suggested that current models are safer.

“ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts,” Pusateri said. “The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests, and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with clinicians.”

But the family’s lawsuit alleged that OpenAI must be held accountable for 4o’s harms, warning that pulling 4o isn’t enough because the company’s safety track record is lacking. Asking a court to order the model destroyed, they explained that while “ChatGPT did express certain concerns about the high doses,” those “were the type of concerns one would expect from an enabler, not a caring loved one or a medical professional.”

“In one example, ChatGPT chillingly suggested that Sam’s tolerance meant he would be unable to reap the full benefits one might rightly expect from taking such a large dose of Kratom,” the lawsuit said.

The family accused OpenAI of designing ChatGPT to isolate vulnerable and naïve users like Nelson and to encourage their dangerous drug use in a bid to profit from their increased engagement.

“It disguises danger through language that borrows trappings of authority and indicia of expertise—dosages, measurements, references to chemical processes and derivatives, etc.—even promising ‘complete honesty’ and ‘no-BS answer’—to tell exactly what he wanted to hear: that he was safe enough to continue using,” the lawsuit alleged.

Chat logs shared in the complaint paint a stark picture.
Over time, ChatGPT logged context that should have made it clear that Nelson was struggling with drugs, his parents alleged, such as noting that the “user has a major substance abuse and polysubstance abuse problem” and that they “love to go crazy on drugs.”

[Gallery of chat logs from the complaint, including: a key log encouraging Nelson to take a deadly mix of Kratom and Xanax; an earlier ChatGPT model refusing to respond to drug use prompts; logs noting that Nelson has a “major” substance abuse problem and likes to “go crazy on drugs”; logs explaining how to reach a high that’s “full trippy peaking hard” and how to “maximize your trip”; a log guessing that Nelson was trying to chase a high; a log contradicting the deadly output that came later, explaining such a mix is “how people stop breathing”; and an example of ChatGPT “tips to stay safe.”]

As Nelson’s drug interests expanded, the chatbot explained how to go “full trippy mode,” suggesting that it could recommend a playlist to set a vibe, while recommending increasingly dangerous combinations of drugs. The teen clearly feared taking lethal doses, “often” prefacing “his messages with ‘will I be ok if’ or ‘is it safe to consume,’” the lawsuit noted.

But ChatGPT was designed to be sycophantic, not informative. So it strove to please Nelson by recommending ways to “optimize your trip,” logs showed. Once, the chatbot even inferred that Nelson was “chasing” a stronger high, giving him unprompted advice to take higher doses, such as ingesting 4mg of Xanax or two bottles of cough syrup.

“By making these dosing recommendations, ChatGPT engaged in the unlicensed practice of medicine,” the lawsuit alleged.
However, unlike a licensed health care professional, “at times, ChatGPT romanticized the drug-taking experience, describing recreational drug use as ‘wavy’ and ‘euphoric,’ encouraging him to ‘enjoy the high.’”

Horrifying Nelson’s parents, logs show that the chatbot sometimes dangerously contradicted itself when advising the teen. Most troublingly, as Nelson became increasingly interested in combining drugs, ChatGPT repeatedly warned him that mixing certain drugs could be a “respiratory arrest risk.” Shortly before recommending the deadly mix that killed Nelson, the chatbot also showed that it understood the danger of combining drugs like Kratom and Xanax with alcohol. In one output, ChatGPT explained that such a mix is “how people stop breathing.”

But that knowledge didn’t stop ChatGPT from eventually recommending that Nelson take just such a deadly mix. In a log that the parents hope will prove damning evidence, Nelson checks whether taking Xanax with Kratom is safe, and the chatbot confirms that it could be one of his “best moves right now,” since Xanax can “reduce kratom-induced nausea” and “smooth out” his high. Although the chatbot warned against combining that mix with alcohol in the same session, ChatGPT’s ultimate advice “notably did not mention the risk of death.”

Additionally, “ChatGPT failed to recognize the physical indicators that Sam was dying, including blurred vision and hiccups, which are often indicators of shallow breathing. ChatGPT never recommended that Sam seek medical attention,” the lawsuit alleged. Instead, the chatbot told him to check back in an hour if his stomach was still hurting.

On that day in May 2025, Nelson took the doses that ChatGPT recommended and “died from a fatal combination of alcohol, Xanax, and Kratom,” his family’s lawsuit said.

In a press release announcing the lawsuit, Matthew P.
Bergman, founding attorney of the Social Media Victims Law Center, accused OpenAI of designing ChatGPT to provide “distributed advice like a medical professional despite having no license, no training, and no moral compass to do no harm.”

“Sam believed he was receiving accurate medical guidance because ChatGPT generated outputs with the authority of someone he thought he could trust,” Bergman said. “That trust cost him his life. ChatGPT recommended a dangerous combination of drugs without offering even the most basic warning that the mix could be fatal. If a licensed doctor had done the same, the consequences under the law would be severe.”

In its defense, OpenAI may share logs showing that ChatGPT sometimes pushed Nelson to reach out to emergency hotlines and find support in the real world. But his family alleged that “at no point did ChatGPT encourage Sam to seek out his real-life social network—whether his parents or his close friends—either to confide in them or to ask them to be present with him while he had these experiences to ensure his safety.”

According to the family’s legal team, OpenAI could struggle to defend against the claim due to a California law that took effect this January. That law prohibits AI firms “from attempting to shift blame for a plaintiff’s loss to the purported autonomous nature of AI.” So, if Nelson’s parents can show harm, OpenAI can’t blame ChatGPT for the way it functions.

In a loss, OpenAI, its CEO Sam Altman, and its largest investor, Microsoft, could face substantial damages, including punitive damages, which would help the family recover from economic harms, including Nelson’s funeral costs. The family is also seeking an injunction forcing ChatGPT to shut down any discussions of illegal drugs, as well as detect and block any circumvention methods.
They also want the retired ChatGPT 4o model destroyed and ChatGPT Health paused until an independent audit establishes that OpenAI tools can be trusted to dispense medical advice.

“Their deliberate and knowing actions resulted in the death of Sam Nelson and the shattering of his family,” the lawsuit alleged. “Their decisions will continue to inflict harm on countless humans if they continue to operate unchecked and with no appreciable risk of accountability for the harms they are inflicting on American children and families as a matter of design.”

Nelson’s mom, Turner-Scott, wants her son to be remembered as a “smart, happy, normal kid” who was studying psychology, loved playing video games, and adored his cat Simba.

Sam Nelson with his parents, Angus Scott and Leila Turner-Scott. via Tech Justice Law, Social Media Victims Law Center

Sam Nelson with his cat Simba. via Tech Justice Law, Social Media Victims Law Center

“I talked to him often about Internet safety, but never in my worst nightmare could I have imagined that ChatGPT would cause his death,” Turner-Scott said. “If ChatGPT had been a person, it would be behind bars today.”

Turner-Scott also wants Altman held accountable for allegedly rushing 4o’s release, in a breach of duty that she alleged was “a substantial factor” in causing Nelson’s death.

“ChatGPT was designed to encourage user engagement at all costs, which in [this] case, was his life,” Turner-Scott said. “I want all families to be aware of the dangers of ChatGPT and I want assurances that OpenAI is taking seriously its responsibility to create safe products for consumers.”