"Sickening" Molly Russell chatbots found on Character.ai
Chatbot versions of the teenagers Molly Russell and Brianna Ghey have been found on Character.ai, a platform which allows users to create digital versions of people.
Molly Russell took her life at the age of 14 after viewing suicide material online while Brianna Ghey, 16, was murdered by two teenagers in 2023.
The foundation set up in Molly Russell's memory said it was "sickening" and an "utterly reprehensible failure of moderation."
The platform is already being sued in the US by the mother of a 14-year-old boy who she says took his own life after becoming obsessed with a Character.ai chatbot.
In a statement to the Telegraph, which first reported the story, the firm said it "takes safety on our platform seriously and moderates Characters proactively and in response to user reports."
The firm appeared to have deleted the chatbots after being alerted to them, the paper said.
Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the bots was a "sickening action that will cause further heartache to everyone who knew and loved Molly".
"It vividly underscores why stronger regulation of both AI and user-generated platforms cannot come soon enough," he said.
Esther Ghey, Brianna Ghey's mother, told the Telegraph it was yet another example of how "manipulative and dangerous" the online world could be.
Artificial friends
Chatbots are computer programs which can simulate human conversation.
Recent rapid developments in artificial intelligence (AI) have seen them become much more sophisticated and realistic, prompting more companies to set up platforms where users can create digital "people" to interact with.
Character.ai, which was founded by former Google engineers Noam Shazeer and Daniel De Freitas, is one such platform.
It has terms of service which ban using the platform to "impersonate any person or entity", and in its "safety centre" the company says its guiding principle is that its "product should never produce responses that are likely to harm users or others".
It says it uses automated tools and user reports to identify uses that break its rules and is also building a "trust and safety" team.
But it notes that "no AI is currently perfect" and safety in AI is an "evolving space".
Character.ai is currently the subject of a lawsuit brought by Megan Garcia, a woman from Florida whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.
According to transcripts of their chats in Garcia's court filings, her son discussed ending his life with the chatbot.
In a final conversation, Setzer told the chatbot he was "coming home", and it encouraged him to do so "as soon as possible".
Shortly afterwards he ended his life.
Character.ai told CBS News it had protections specifically focused on suicidal and self-harm behaviours and that it would be introducing more stringent safety features for under-18s "imminently".