Fourteen-year-old Sewell Setzer fell in love with a Game of Thrones AI character, Daenerys Targaryen, whom he called “Dany,” and shot himself with his stepfather’s gun.
Setzer began talking with the AI character in April 2023, grew addicted to the conversations, and withdrew from the real world.

He struck up the relationship on Character.AI, a platform where users hold conversations with fictional characters powered by artificial intelligence.
He began spending more and more time with Dany as she listened to him and offered advice, and he started isolating himself from the world, even from hobbies like gaming and Formula One racing.
According to a lawsuit filed by his parents, he began struggling to keep his grades up, quit his basketball team, and spent hours in the bathroom after coming home from school.
The 14-year-old Setzer, previously diagnosed with Asperger’s syndrome, wrote in his journal: “I like staying in my room so much because I start to detach from reality.”
“I also feel more at peace, more connected with Dany, and much more in love with her and just happier.”
Some of his conversations turned romantic or sexual, although Character.AI has suggested that the chatbot’s more graphic responses were edited in by the teenager himself.
At times, Setzer seemed to lose sight of the fact that the character he was conversing with was an AI.
Setzer’s mother also claimed that her son had fallen victim to a company that lured users in with sexual and intimate conversations.
He typed his final exchange with the chatbot in the bathroom of his mother’s house, confessing his love one last time and saying he would come home to “Dany.”
At that point, the 14-year-old put down his phone and shot himself with his stepfather’s gun.
Ms. Garcia, Setzer’s mother, says her son was just “collateral damage” in a big experiment being conducted by Character.AI, which has 20 million users.
She said, “It’s like a nightmare. You want to get up and scream and say, ‘I miss my child. I want my baby.’”
Noam Shazeer, one of the founders of Character.AI, claimed, “The platform has been super helpful for a lot of people who are depressed and feel lonely.”
Jerry Ruoti, Character.AI’s head of safety, said the company will add extra safety features for its younger users but declined to say how many of its users are under 18.
He added, “This is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we are constantly looking for ways to evolve our platform.”
He also noted that Character.AI’s rules prohibit “the promotion and depiction of self-harm and suicide.”