Mom shares scary warning after her child reveals the disturbing thing they saw on YouTube Kids

In a connected age with easy access to smartphones, children are spending more time online than ever before, and that has sparked concern about the content they are exposed to. YouTube Kids, a customized version of the popular video-sharing platform YouTube, was developed as a virtual playground for young minds: a seemingly secure environment where children can explore fun and educational videos without stumbling onto inappropriate material. However, one worried mother on TikTok, Kristin Knighton (@kristinknighton), has shocked parents everywhere with a viral video warning them about YouTube Kids. She claimed that not everything on the platform is as harmless as it seems.

She realized something was wrong when her son began wandering into her room late at night, between 9 PM and 1 AM. In the video, Knighton said, “He’s crying, and he gives me a hug, and when I ask him what’s wrong, he says, ‘I just love you.’ I started thinking he was sleepwalking.” Over time, the child’s unusual behavior intensified. One morning during breakfast, he asked his mom, “Can you die happy?” He then followed up with a question that would shock anyone: “Can you die happy so I don’t have to kill you?” Knighton said she asked him if he wanted her to die. He responded, “No, but the show says you have to die happy.”

Knighton asked her son which show he was talking about, and he simply replied that it was “the dragon show” on YouTube. She said, “Now he’s not supposed to be watching YouTube. The last time he was on YouTube Kids, there was a little cartoon telling my son to go run and jump in front of cars. Pay attention to what your kids are watching.” Several users expressed their shock in the comment section of the video. One TikTok user, @jas.beale, commented, “It SICK how these people target our children knowing how vulnerable they are. It makes me so sad. Poor Troy he is actually the sweetest and I can’t imagine him hearing that.” Another person, @ayooebun, asked, “But who are the people making these kinda shows for kids man damn.”
Similarly, @killer_kenz claimed, “Yeah no YouTube needs to have better surveillance of what ppl put on their site. cause like poppy playtime and Huggy Wuggy are preteen-early teens genres. NOT children.” According to YouTube, YouTube Kids is protected by a mix of automated filters, human moderation, and parental feedback, all of which are intended to provide a safe environment for children. However, some users believe that inappropriate content sometimes slips past these safeguards. A past investigation by the Tech Transparency Project, a U.S.-based nonprofit, exposed cracks in the system. Researchers used three accounts, each aligned with a different age group on the app, and uncovered a troubling number of videos that should never have slipped through Google’s screening process.

The amount of improper content the researchers found was startling, according to Katie Paul, director of the Tech Transparency Project, who voiced serious concern over the results. As reported by The Guardian, she said the presence of drug-related content was, in her view, the most disturbing finding. Paul argued that these findings underscore why algorithm-driven content shouldn’t be promoted as kid-friendly. Even though the company says protecting children is its priority, harmful content still finds its way through, reaching some of the most vulnerable viewers.