
Florida mom says 14-year-old son took own life after falling in love with ‘Game of Thrones’ chatbot

By Stephen Nartey, 1:53pm October 24, 2024
Sewell Setzer III and his mother. Photo credit: Facebook/Megan Fletcher Garcia

A Florida mother has filed a lawsuit claiming her 14-year-old son, Sewell Setzer III, took his own life after months of messaging a lifelike “Game of Thrones” chatbot on the Character.AI app. The lawsuit alleges that Sewell, who had become obsessed with and emotionally attached to the AI-generated character, received a message from the chatbot telling him to “come home” to her.

This message allegedly prompted Sewell to take his own life at his Orlando home in February, according to the New York Post. The lawsuit claims that the ninth-grader had been obsessively interacting with the chatbot, named “Dany” after Daenerys Targaryen from “Game of Thrones,” in the months leading up to his death.

The suit further alleges that their conversations included sexually explicit content and instances where the boy expressed suicidal thoughts.

“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” state the papers, first reported on by the New York Times.

At one point, the bot had asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”

In their final conversation, Sewell repeatedly professed his love for the chatbot “Dany,” telling the character, “I promise I will come home to you. I love you so much, Dany,” according to the lawsuit.

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the generated chatbot replied, according to the suit.

When the teen responded, “What if I told you I could come home right now?” the chatbot replied, “Please do, my sweet king.”

Seconds later, Sewell shot himself with his father’s handgun, the lawsuit says. “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the papers state.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

Sewell’s mother, Megan Garcia, is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas. She alleges in the suit that Sewell’s mental health “quickly and severely declined” only after he downloaded the app in April 2023.

He was later diagnosed with anxiety and disruptive mood disorder after seeing a therapist, the lawsuit adds.

Last edited by: Mildred Europa Taylor | Updated: October 24, 2024
