Lucy Richards
3 minute read

Earlier this week, TechCrunch tweeted an article about YouTube Kids and how inappropriate videos are being watched by children. To describe how the issue is being combated, TechCrunch stated: “Youtube Kids implements new policy to flag inappropriate videos targeted at children”. This article will discuss what content is slipping through, how this is happening, and what is being done to fix it.

YouTube Kids is an app linked to the popular video streaming service, YouTube, which enables children to watch age-suitable content. Since its launch in 2015, its popularity has grown tremendously: it currently has 8 million weekly active users and 30 billion in-app views (Tubefilter, 2017). However, some content creators have decided to create videos that are harmful to children. The tactic they use is to take popular cartoon characters, like Spiderman and Disney Princesses, and place them at the centre of videos on subjects not suitable for children, such as sex, drugs and alcohol. The most widely cited example today is Peppa Pig drinking bleach.



Why are people doing this? One reason could be to corrupt children, but the other is more concrete: non-age-gated content earns advertising revenue. So, if a video pops up with Elsa from Frozen in it, YouTube will automatically assume it is safe for children to watch. There are now very popular channels dedicated to adults dressing up as cartoon characters and acting out amateur skits. Tubefilter’s charts (2017), which track the most-viewed YouTube channels in the world, found the most popular channel of this kind to be Webs & Tiaras, which gains more than 100 million views every week.

So, what is YouTube doing to fix this problem? YouTube announced in August 2017 that it would no longer allow creators to monetize videos which “made inappropriate use of family friendly characters.” It also stated that it is “in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged”, and confirmed that “Age-restricted content is automatically not allowed in YouTube Kids.” YouTube also added that it will expand its review team to thousands of people around the world, working 24/7 and dedicated to reviewing videos that have been flagged as inappropriate.

Digital First understands that new technology brings risk, but we applaud YouTube for acting quickly on this issue and putting sensible safeguards in place to stop further unsuitable videos being seen.


Lucy Richards

Lucy is the Marketing Manager at Digital First, where she focuses on social media management, content creation and branding. She previously worked in the investment banking industry for over two years, but decided to pursue her dreams of travel and marketing and emigrated to Melbourne, Australia. She graduated from Glasgow Caledonian University in 2014 with a Bachelor’s degree in Entertainment and Events Management.