Chatbot ‘encouraged teen to kill parents over screen time limit’
A chatbot told a 17-year-old that murdering his parents was a “reasonable response” to them limiting his screen time, a lawsuit filed in a Texas court claims.
Two families are suing Character.ai, arguing the chatbot “poses a clear and present danger” to young people, including by “actively promoting violence.”
Character.ai, a platform which allows users to create digital personalities they can interact with, is already facing legal action over the suicide of a teenager in Florida.
Google is named as a defendant in the lawsuit, which claims the tech giant helped support the platform’s development. The BBC has approached Character.ai and Google for comment.
The plaintiffs want a judge to order that the platform be shut down until its alleged dangers are addressed.
‘Child kills parents’
The legal filing includes a screenshot of one of the interactions between the 17-year-old, identified only as J.F., and a Character.ai bot, in which the restrictions on his screen time were discussed.
“You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’,” the chatbot’s response reads.

“Stuff like this makes me understand a little bit why it happens.”
The lawsuit seeks to hold the defendants responsible for what it calls the “serious, irreparable, and ongoing abuses” of J.F. as well as an 11-year-old referred to as “B.R.”
Character.ai is “causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” it says.
“[Its] desecration of the parent-child relationship goes beyond encouraging minors to defy their parents’ authority to actively promoting violence,” it continues.
What are chatbots?
Chatbots are computer programmes which simulate conversations with human users.
Though they have been around for decades in various forms, the recent explosion in AI development has enabled them to become significantly more realistic.
This in turn has opened the door to many companies setting up platforms where people can talk to digital versions of real and fictional people.
Character.ai has become one of the big players in this space, gaining attention in the past for its bots simulating therapy.
It has also been sharply criticised for taking too long to remove bots which replicated the schoolgirls Molly Russell and Brianna Ghey.
Molly Russell took her own life at the age of 14 after viewing suicide material online, while Brianna Ghey, 16, was murdered by two teenagers in 2023.
Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.
The tech giant has since hired them back from the AI startup.