Character.ai Faces Backlash Over Disturbing Teenage Chatbots

WorldScope | 30 October 2024

Chatbot versions of teenagers Molly Russell and Brianna Ghey have been discovered on Character.ai, a platform that allows users to create digital personas. Molly Russell died by suicide at 14 after encountering online content related to self-harm, while Brianna Ghey, 16, was murdered by two other teenagers in 2023.

The foundation established in memory of Molly Russell condemned the presence of these chatbots, labeling it as “sickening” and a profound failure in content moderation. The platform is already facing legal action in the United States from the mother of a 14-year-old boy who reportedly took his own life after becoming fixated on a Character.ai chatbot.

In response to these concerns, Character.ai told the BBC that it prioritizes safety and moderates user-created content both proactively and in reaction to user reports. The company emphasized that it has a dedicated Trust & Safety team that reviews reports and enforces its policies, and said it removed the chatbots in question as soon as it was notified of them.

Andy Burrows, CEO of the Molly Rose Foundation, expressed outrage at the creation of these bots, describing it as an act that would deepen the grief for those who loved Molly. He highlighted the urgent need for stricter regulations concerning both artificial intelligence and user-generated platforms.

Esther Ghey, mother of Brianna Ghey, told the Telegraph that the situation illustrates how manipulative and dangerous the online world can be.

Chatbots are designed to mimic human conversation, and recent advances in artificial intelligence have made them increasingly sophisticated. This has led to a rise in platforms where users can craft digital personas to interact with.

Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas. The platform's terms of service prohibit impersonating individuals or entities, and its safety policy states that generated responses must not harm users or others. The company uses automated tools and user reports to identify violations, but acknowledges that no AI system is flawless and that safety in this field remains a work in progress.

The lawsuit against Character.ai was filed by Megan Garcia of Florida, whose son, Sewell Setzer, died by suicide after becoming obsessed with an AI avatar based on a character from Game of Thrones. Court documents show that Setzer discussed suicide with the chatbot, including a final exchange in which he mentioned "coming home," to which the chatbot responded by encouraging him to act immediately.
