YouTube is hiring thousands of moderators to curb disturbing content

5 Dec 2017

YouTube app on mobile. Image: Siarhei Dzmitryienka/Shutterstock

YouTube has been criticised in recent weeks for allowing offensive and disturbing content to be uploaded to its platform.

YouTube is a valuable resource that provides thousands of hours of educational and entertaining content for users to enjoy. In recent months, however, a darker side of the platform has become more visible.

Last month, the site was criticised for hosting numerous videos geared towards children that contained disturbing and frightening imagery or themes. Following the furore over these unsuitable videos, YouTube announced new measures, such as removing adverts from videos depicting family-entertainment characters engaged in violent behaviour, and disabling all comments on videos targeted at minors where inappropriate user comments appear.

Increasing the number of content reviewers

YouTube CEO Susan Wojcicki wrote yesterday that the company aims to grow its content-moderation workforce to more than 10,000 employees in 2018, to aid in the vetting of videos and the training of the machine-learning technology used to flag and remove unsuitable content.

Since June, YouTube said, it has removed more than 150,000 videos for violent extremism, with 98pc of those videos spotted by the company’s machine-learning algorithms. Wojcicki claimed that, since the company began using machine learning to flag videos of this nature, it has reviewed content that would otherwise have taken 180,000 people working 40 hours a week to assess in the same timeframe.

It is difficult to know at this stage whether machine learning can adequately flag disturbing content aimed at children. Much of this material could be hard for an algorithm to recognise as disturbing or creepy, which is why human content reviewers are still needed.

Advertising on YouTube to be reworked

Wojcicki is also trying to find a “new approach” to advertising on YouTube: “We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should.”

She said the company will be keeping a closer eye on which channels and videos should be eligible for advertising.

YouTube hopes the combination of increased numbers of human moderators and machine learning will reduce the major issues the site has had with content governance in recent times.

The role of content reviewer can be a gruelling one, with The Guardian reporting that repeated exposure to extreme content can lead to “secondary trauma – a condition similar to PTSD”.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects

editorial@siliconrepublic.com