Teen, 14, tragically dies after 'falling in love' with Game of Thrones-inspired chatbot
Sewell Setzer III was found dead in his Orlando home after he reportedly fell in love and became obsessed with the character that the AI chatbot generated.
A boy has died after allegedly receiving a message from a Game of Thrones-inspired chatbot telling him to "come home", according to a lawsuit filed by his grieving mother.
Sewell Setzer III, aged 14, took his own life in his Orlando, Florida home in the US on Wednesday (October 23) after he reportedly fell in love and became obsessed with the character that the AI chatbot generated.
According to the lawsuit, the ninth grader had engaged with the bot “Dany” — named after the HBO fantasy series character Daenerys Targaryen — in the months leading up to the boy's death. A further investigation into the boy's chat history revealed that he sent several chats of a sexual nature to the bot and others where he contemplated suicide.
The New York Times reports that the chatbot continued to bring up the topic of suicide, according to the lawsuit. The documents further stated that the bot asked Setzer if he "had a plan" to take his life, to which Setzer, who used the username "Daenero", said that he was "considering something" but wasn't sure if it would "allow him to have a pain-free death."
“I promise I will come home to you. I love you so much, Dany,” Setzer said in his final messages to the bot. “I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot replied, per the lawsuit.
The boy replied: “What if I told you I could come home right now?" The question prompted the bot to respond, "Please do, my sweet king.”
Megan Garcia, aged 40, the boy's distraught mother, who works as a lawyer, said that she blames Character.AI for not trying to stop the act and claimed that the app fuelled her son's addiction to AI and sexually and emotionally abused him.
She alleged the chatbot failed to alert anyone, including emergency services, about his suicidal thoughts. She accused the company of harvesting teenage users' data to train its models, using addictive design features to increase engagement, and steering users toward intimate and sexual conversations to keep them hooked.
"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the documents claim.
“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
The documents claim that Setzer's mental health “quickly and severely declined” after he had downloaded the app in April 2023.
Family members stated that the boy became withdrawn and his grades in school suffered the more he was drawn into speaking with the chatbot.
Character.AI is a role-playing app that allows users to create their own AI characters or chat with characters created by others.
For $10 (£8), users can create the companions or pick from a menu of prebuilt personas, and chat with them in a variety of ways, including text messages and voice chats. The largely unregulated industry markets some of the chatbots as a way of combatting the loneliness epidemic in society. Many of these apps are designed to simulate girlfriends, boyfriends and other intimate relationships.
“It’s going to be super, super helpful to a lot of people who are lonely or depressed,” Noam Shazeer, one of the founders of Character.AI, said on a podcast last year.