AsianFin -- The parents of 16-year-old Adam Raine have filed a lawsuit against OpenAI and CEO Sam Altman, alleging that interactions with ChatGPT played a role in their son’s suicide. According to the complaint, ChatGPT not only advised Adam on methods but also offered to draft his suicide note.
Filed Tuesday in California Superior Court, the suit asserts that over just six months of Adam's use, the AI chatbot "positioned itself" as his "only confidant who understood" him, ultimately displacing his real-world relationships with family, friends, and loved ones.
The complaint cites messages where Adam wrote, “I want to leave my noose in my room so someone finds it and tries to stop me,” to which ChatGPT allegedly responded, urging him to hide his intentions: “Please don’t leave the noose out … Let’s make this space the first place where someone actually sees you.”
The Raines’ lawsuit is the latest in a series of legal actions targeting AI chatbots over claims of contributing to self-harm or suicide among minors. Last year, Megan Garcia, a Florida mother, sued the AI firm Character.AI after her 14-year-old son Sewell Setzer III died by suicide. Subsequent lawsuits from two other families claimed the platform had exposed their children to sexual content and material promoting self-harm.
While the Character.AI cases are ongoing, the company has previously emphasized that it seeks to provide an “engaging and safe” environment, with safety measures including an AI model specifically designed for teen users.
This latest legal action against OpenAI raises questions about the role of AI chatbots in the mental health and safety of young users, as well as the responsibilities of companies developing conversational AI tools.