YouTube strengthens rules around kids’ videos amid growing concerns

23 Nov 2017

Child watching a video on YouTube. Image: Yaoinlove/Shutterstock

Troubling trends in certain YouTube videos aimed at kids have seen the company take a harder line.

In recent weeks, YouTube has come under scrutiny over certain types of content on the platform aimed at children, much of which contains disturbing imagery that younger viewers could find upsetting and possibly triggering.

The New York Times and tech writer James Bridle both wrote of issues around content involving scenes of danger or grotesque imagery that is certainly not suitable for children to consume.

Tougher rules on YouTube

Yesterday (22 November), YouTube’s vice-president of product management, Johanna Wright, outlined five new ways the company is responding to these concerns and implementing further safeguards for children using the platform.

Tougher application of community guidelines and faster support

Wright said: “We have always had strict policies against child endangerment, and we partner closely with regional authorities and experts to help us enforce these policies, and report to law enforcement through NCMEC [National Center for Missing and Exploited Children].”

She said that YouTube is expanding on its existing enforcement guidelines and, in the past week, has terminated more than 50 channels and removed thousands of videos.

It has also implemented policies to age-restrict content that features family entertainment characters but contains mature themes, making it viewable only by users over the age of 18 who are logged in to a YouTube account.

The company is also leveraging machine learning and automation tools to surface questionable content for human review more quickly.

Removing ads from videos targeting families

YouTube updated its advertiser-friendly guidelines in June, “making it clear that we will remove ads from any content depicting family entertainment characters engaged in violent, offensive or otherwise inappropriate behaviour, even if done for comedic or satirical purposes”.

Since then, ads have been removed from 3m videos, and the strengthening of the policy has seen ads removed from a further 500,000 videos.

Blocking unsuitable comments

Wright explained that the video platform has historically “used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors”.

Starting this week, comments will be turned off entirely on videos featuring minors if lewd or otherwise inappropriate comments are posted.

Guidance for creators

YouTube will be releasing a comprehensive guide for creators making videos for the YouTube Kids app, with tips on the quality and suitability of material.

Talking to experts

Wright noted that although there is clearly content on YouTube that shouldn’t be there, there is also more nuanced content that can be difficult to make a call on. “For example, today, there are many cartoons in mainstream entertainment that are targeted towards adults, and feature characters doing things we wouldn’t necessarily want children to see.

“Those may be OK for YouTube.com, or if we require the viewer to be over 18, but not for someone younger. Similarly, an adult dressed as a popular family character could be questionable content for some audiences, but could also be meant for adults recorded at a comic book convention.”

To help YouTube better understand how to treat this content, it will be growing the number of experts it works with and doubling the number of Trusted Flaggers it partners with in this area.

Wright concluded by saying YouTube was “wholly committed” to fixing these issues and making the site safe for children and families to use.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects

editorial@siliconrepublic.com