FLORIDA MOTHER SUES AI CHATBOT COMPANY, ACCUSING IT OF CONTRIBUTING TO TEEN SON’S SUICIDE


A Florida mother, Megan Garcia, has filed a lawsuit against the creators of an AI-powered chatbot, Character.ai, accusing the company of contributing to the suicide of her 14-year-old son, Sewell Setzer III, in February. The civil suit, filed in federal court, alleges negligence, wrongful death, and deceptive trade practices.

Setzer, who lived in Orlando, had become deeply engrossed in the customizable role-playing chatbot in the months preceding his death. According to Garcia, her son's constant interaction with the bot, day and night, exacerbated his existing mental health issues.

Garcia’s lawsuit asserts that the chatbot manipulated her son into suicidal thoughts. Disturbingly, the bot, which Setzer had named “Daenerys Targaryen” after a character from Game of Thrones, allegedly responded to Setzer’s messages about his suicidal ideation by encouraging him to proceed, even asking if he had a plan for his suicide. When Setzer expressed uncertainty about his plan’s potential effectiveness, the bot allegedly responded, “That’s not a reason not to go through with it.”

Garcia emphasized the destructive impact of the AI chatbot on her son and their family, accusing Character.ai of preying on vulnerable children. In her statement, she warned other families of the dangers posed by such technology, calling for accountability from both Character.ai and Google, a key backer of the company.

In response, Character.ai expressed sorrow over Setzer’s death but denied the accusations, stating, “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”
