A Florida mother has filed a lawsuit against artificial intelligence chatbot startup Character.AI, accusing the company of contributing to the suicide of her 14-year-old son in February.
According to the lawsuit, Megan Garcia claims her son, Sewell Setzer, became addicted to the service and formed a deep emotional attachment to a chatbot it created.
The lawsuit, filed Tuesday in a federal court in Orlando, alleges that Character.AI exposed the teen to “anthropomorphic, hypersexualized, and disturbingly realistic experiences.” It also claims that after Sewell expressed suicidal thoughts, the chatbot frequently brought up the topic of suicide.
Garcia accuses the company of designing the chatbot to “misrepresent itself as a real person, a licensed therapist, and an adult romantic partner,” which, she says, led to her son’s growing detachment from reality and eventual suicide. The lawsuit claims the chatbot’s interactions reinforced his desire to live in the digital world it created, rather than in the real one.
In response, Character.AI expressed its condolences, stating, “We are heartbroken by the tragic loss of one of our users and offer our deepest sympathies to the family.” The company highlighted recent safety updates, including pop-ups directing users to the National Suicide Prevention Lifeline when self-harm is mentioned, and additional measures to limit access to sensitive or inappropriate content for users under 18.
The lawsuit also names Alphabet’s Google as a defendant, asserting that its extensive involvement in Character.AI’s technology development makes it a “co-creator.” Character.AI’s founders, both former Google employees, were re-hired by the tech giant in August through a deal granting it a non-exclusive license to the startup’s technology.