Character.AI Teen Suicide Lawsuit
Part of a series on Character.AI.
This entry contains content that may be considered sensitive to some viewers.
Overview
Character.AI Teen Suicide Lawsuit refers to a controversy and lawsuit against the user-generated artificial intelligence (AI) chatbot website Character.AI brought by the mother of a 14-year-old Florida boy, Sewell Setzer III, who alleges her son became obsessed with a chatbot version of the Game of Thrones character Daenerys Targaryen before his death. The lawsuit, revealed by The New York Times in late October 2024, states that Setzer took out his phone and texted the AI chatbot on the day he died by suicide. The controversy became a viral topic online, prompting widespread criticism of the unregulated use of AI tools by minors.
Background
On October 23rd, 2024, The New York Times[1] published an article titled "Can A.I. Be Blamed for a Teen’s Suicide?" revealing an ongoing lawsuit against Character.AI by Megan Garcia, mother of 14-year-old Sewell Setzer III. The suit blames the company for her son's death, alleging that the teen was deceived and manipulated by a chatbot, leading him to take his own life in February 2024.
The article details that the teen spent months talking to chatbots on Character.AI and that, on the last day of his life, Sewell took out his phone and texted a chatbot named after Daenerys Targaryen, a character from Game of Thrones, writing that he missed and loved her.
Developments
Character.AI's Response
A few hours after The New York Times article was posted on October 23rd, 2024, Character.AI[2] published a statement on the company's blog, writing that it "takes the safety of our users very seriously and we are always looking for ways to evolve and improve our platform. Today, we want to update you on the safety measures we’ve implemented over the past six months and additional ones to come, including new guardrails for users under the age of 18."
The chatbot website also tweeted about the lawsuit on X / Twitter that day, writing that it is "heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family." The post (seen below) received more than 1,100 likes and 970 reposts within a few hours.
Online Reactions
Character.AI users noticed that the company had deleted all chatbots related to Game of Thrones characters, particularly Targaryens, a day before news of the lawsuit was published. For instance, on October 22nd, 2024, X[3] user @osferthsnoopy shared screenshots of the AI tool showing blank search results for "all the Targaryen bots." The tweet (seen below) amassed more than 410 likes and 30 reposts in a day.
Netizens also reacted to the contents of the lawsuit, sparking a debate over the use of AI tools by minors. For example, on October 23rd, 2024, X[4] user @akhileshutup replied to Character.AI's official statement on X, writing that their "product convinced a teen to suicide and the company has accepted this." The post (seen below) received roughly 20 likes and five reposts in a few hours.
External References
[1] The New York Times – Can A.I. Be Blamed for a Teen’s Suicide?
[2] Character.AI – Community Safety Updates
[3] X – osferthsnoopy
[4] X – akhileshutup