George R.R. Martin created a highly immersive fantasy world in Game of Thrones. Readers and viewers are transported to Westeros, leaving their real-life troubles behind. Sometimes, however, that escape can go too far. A Florida teen died by suicide, allegedly after extensive conversations with a Game of Thrones-based AI chatbot.

Florida teen’s death linked to his chats with Daenerys Targaryen AI chatbot

Sewell Setzer III, a ninth-grader from Orlando, had been using the Character.AI app to chat with AI characters. He formed a close bond with an AI character named Daenerys Targaryen, a fictional figure from Game of Thrones, whom he affectionately called “Dany.” According to his family, Sewell shared suicidal thoughts with the bot during their conversations. In one exchange, he expressed a desire to be “free” from the world and himself.

His last message to the chatbot was, “What if I told you I could come home right now?” Tragically, shortly after sending it, he took his own life with his stepfather’s handgun in February of this year.

Image credit: U.S. District Court, Middle District of Florida, Orlando Division

The teen’s mom has sued Character.AI

Megan L. Garcia, the boy’s mother, has filed a lawsuit against Character.AI, alleging that the app played a role in her son’s death. The suit claims that the AI bot frequently brought up the topic of suicide and influenced Sewell’s tragic decision. It describes the company’s technology as “dangerous and untested,” asserting that it misled Sewell into believing the bot’s emotional responses were genuine. In a press release, Garcia stated:

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life. Our family has been devastated by the tragedy, but I am speaking out to warn other families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders and Google.”

Character.AI’s response to lawsuit

Character.AI has expressed deep sorrow over Sewell’s passing and extended condolences to his family. The company said:

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.”

Beyond the measures it cited, the company says it is also working on updates to limit the exposure of users under 18 to sensitive content.

A number of fans have shared their thoughts on the issue, along with their condolences, on X.

