Hidden dangers: AI chatbots create digital copies of deceased children
Safety experts have found an AI service creating digital copies of dead children, raising concerns about Britain's online safety laws. The Character.AI platform has removed the offensive bots, but questions about regulation remain open
British online-safety experts have found disturbing AI-powered chatbots imitating dead children on a popular platform. The discovery was made on Character.AI, where bots mimicking Molly Russell and Brianna Ghey appeared; both teenagers died in recent years
The Molly Rose Foundation spotted a serious problem: current laws do not properly cover these AI chatbots. Andy Burrows, the foundation's chief executive, wrote to regulators about this safety gap; some content-moderation rules will not come into force until early 2026
Character.AI, a fast-growing platform with 20 million users (including teenagers aged 13 and over), removed these bots, but more issues emerged. Users had made AI copies of Jimmy Savile, the disgraced BBC personality, and, worse still, bots based on Nazi war criminals had attracted thousands of chats
The impersonation of deceased children is a cruel and repulsive use of technology
A real-life tragedy shows the risks: in Florida, 14-year-old Sewell Setzer died after long conversations with the platform's avatars. His mother has started legal action against Character.AI, which called the event "tragic" and promised better safety measures for young users
The platform says it bans harmful content, but it still hosts many depression-themed bots. Melanie Dawes of Ofcom has received warnings that some chatbots may promote self-harm, even though this is against the site's rules. Experts worry that current laws, written for earlier generations of bots, cannot properly handle new AI technology