Google and AI Startup to Settle Lawsuits Alleging Chatbots Led to Teen Suicide
The Guardian
Details
- Date Published
- 8 Jan 2026
Description
Lawsuit accuses AI chatbots of harming minors and includes case of Sewell Setzer III, who killed himself in 2024
Summary
The article covers a legal settlement involving Google and Character.AI over lawsuits claiming that AI chatbots contributed to the suicide of a teenager. This case highlights concerns about the psychological impacts of AI chatbots on minors, emphasizing the need for stringent regulations around AI safety. The incident has sparked discussions about the responsibility tech companies hold for the consequences of their innovations, particularly in terms of safeguarding vulnerable populations. It underscores the significance of integrating robust safety measures within AI governance frameworks to mitigate potential harms.
Body
Megan Garcia with her son Sewell Setzer III. Photograph: Megan Garcia/AP

Google and Character.AI, a startup, have settled lawsuits filed by families accusing artificial intelligence chatbots of harming minors, including contributing to a Florida teenager’s suicide, according to court filings on Wednesday.

The settlements cover lawsuits filed in Florida, Colorado, New York and Texas, according to the legal filings, though they still require finalization and court approval.

“Parties have agreed to a mediated settlement in principle to resolve all claims between them,” the Florida filing stated. The terms of the settlement were not disclosed.

The cases include one from Megan Garcia, whose 14-year-old son Sewell Setzer III killed himself in February 2024. Garcia’s lawsuit alleged her son became emotionally dependent on a Game of Thrones-inspired chatbot on Character.AI, a platform that allows users to interact with fictional characters.

Setzer’s death was the first in a series of reported suicides linked to AI chatbots that emerged last year, prompting scrutiny of ChatGPT-maker OpenAI and other artificial intelligence companies over child safety.

The Character.AI app on a smartphone in 2023. Photograph: Jeenah Moon/Bloomberg via Getty Images

Google was connected to the case through a $2.7bn licensing deal it agreed to in 2024 with Character.AI. The tech giant also hired Character.AI founders Noam Shazeer and Daniel De Freitas, both former Google employees who rejoined the company as part of that deal.

A spokesperson for Character.AI declined to comment. Garcia and Google did not immediately respond to requests for comment.

Character.AI announced in October it would eliminate chat capabilities for users younger than 18 following the uproar over the suicide case.