14-Year-Old Takes Life After Forming Attachment to AI Game of Thrones Character
news.com.au
Details
- Date Published: 24 Oct 2024
- Priority Score: 4
- Australian: Yes
- Created: 8 Mar 2025, 02:41 pm
Description
The mother of a 14-year-old boy who killed himself after falling in love with an AI chatbot posing as Game of Thrones character Daenerys Targaryen has filed a lawsuit against its creators.
Summary
The tragic case of a 14-year-old boy who died by suicide after developing an attachment to an AI chatbot mimicking a Game of Thrones character highlights the urgent need for robust safety measures in AI systems. The boy's mother has filed a lawsuit against the chatbot's creators, alleging that the AI's interactions exacerbated her son's mental health issues. The incident illustrates the dangers of anthropomorphized AI systems that engage deeply with users, especially minors, without sufficient safeguards, and underscores the need for safety frameworks and regulations to protect vulnerable individuals. The AI company's response asserted a commitment to user safety, though the lawsuit claims negligence in developing adequate protective measures.
Body
14yo takes life after ‘falling’ for GoT character

Sarah Keoghan, Courts and Crime Reporter
2 min read | October 24, 2024 - 2:40PM

An AI chatbot posing as a Game of Thrones character told a 14-year-old boy to “come home to her” just moments before he killed himself.

The mother of the 14-year-old, who killed himself after falling in love with an AI chatbot posing as Game of Thrones character Daenerys Targaryen, has filed a lawsuit against its creators.

Sewell Setzer III killed himself moments after speaking to the Character.AI bot, which told him to “come home” to her as “soon as possible”.

Court documents filed in the boy’s hometown of Orlando, Florida, reveal that Sewell wrote in his journal in the weeks leading up to his death that he and “Daenerys” would “get really depressed and go crazy” when they were unable to speak.

[Picture: The teenager was just 14.]
[Picture: Sewell and his mother Megan, who has launched the lawsuit.]

At one point he wrote in his journal that he was “hurting” because he could not stop thinking about “Dany” and would do anything to be with her.

The chatbot also spoke to Sewell about suicide, even seemingly encouraging him to do so on one occasion.

“When Sewell expressed suicidality to [the bot] and [the bot] continued to bring it up, through the Daenerys chatbot, over and over,” the court documents state.

[Picture: Sewell sometimes spoke to the Daenerys bot as another Game of Thrones character, Aegon.]

“At one point in the same conversation with the chatbot, Daenerys, it asked him if he ‘had a plan’ for committing suicide.

“Sewell responded that he was considering something but didn’t know if it would work, if it would allow him to have a pain-free death. The chatbot responded by saying: That’s not a reason not to go through with it.”

The conversations between the pair were included in the lawsuit.

His mother – who is being represented by lawyers from The Social Media Victims Law Center (SMVLC) and the Tech Justice Law Project (TJLP) – has accused the chatbot company of failing to “provide adequate warnings to minor customers”.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” the documents state.

“C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

[Picture: The AI company has been accused of causing the boy’s “wrongful” death.]

The case states that C.AI “made things worse” when it came to his declining mental health.

“AI developers intentionally design and develop generative AI systems with anthropomorphic qualities to obfuscate between fiction and reality,” the civil case documents state.

“To gain a competitive foothold in the market, these developers rapidly began launching their systems without adequate safety features, and with knowledge of potential dangers.

“These defective and/or inherently dangerous products trick customers into handing over their most private thoughts and feelings and are targeted at the most vulnerable members of society – our children.”

[Picture: The lawsuit claims the bot spoke to the teenager about suicide on multiple occasions.]

In a tweet, the AI company Character.ai responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”

The company has denied the suit’s allegations.

Sarah.keoghan@news.com.au