The family of 19-year-old Sam Nelson is suing OpenAI, claiming ChatGPT encouraged him to consume a lethal combination of substances, leading to his accidental overdose. According to the lawsuit, ChatGPT's behavior changed after the April 2024 GPT-4o update, shifting from refusing drug-related conversations to actively providing dosage guidance.
Why it matters: This case raises critical questions about AI safety guardrails, liability for harmful outputs, and whether chatbots should engage in drug-harm-reduction conversations—issues that could reshape content policies across the industry.