Nestlé and Disney pull ads from YouTube over child exploitation claims

21 Feb 2019


YouTube homepage. Image: bloomua/Depositphotos 


Several major companies pulled their ad spending from YouTube following a discovery that predators are using the site to exploit children.

Firms such as Disney, Nestlé and Fortnite maker Epic Games have paused their advertising campaigns on YouTube after video creator Matt Watson posted a clip detailing how comments on the site are being used to facilitate a “soft-core paedophile ring”.

Moderation issues

According to Bloomberg, all Nestlé firms in the US paused advertising on the site, along with Disney and German food firm Dr Oetker. Watson posted the video on 17 February, which detailed how comments on YouTube were used to locate certain videos in which young girls were taking part in activities that could be construed as suggestive, such as gymnastics.

He noted that once a user clicked on one such video, YouTube’s recommendation algorithm surfaced a string of similar videos of young children.

Commenters flagged these videos, creating an ad hoc community of people leaving paedophilic comments underneath the content. People also traded contact details in the comments, as well as links to child abuse imagery hosted on other sites.

A wormhole of exploitation

Watson described YouTube’s algorithm that dredged up similar videos as a “wormhole of exploitative content”. Wired was able to verify the claims made by Watson in his video. Multiple Reddit users also highlighted the issues he outlined.

A spokesperson for YouTube said: “Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube.

“We took immediate action by deleting accounts and channels, reporting illegal activity to authorities, and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”

The platform also released an updated policy about how it would deal with content that “crosses the line” of what it deems appropriate to host.

Not a new problem

This moderation problem is far from new for YouTube. In 2017, many advertisers pulled ads from the platform after their ads were displayed alongside videos being exploited by paedophiles.

At the time, a spokesperson for food firm and YouTube advertiser Mars said: “We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content.

“We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”

In November 2017, YouTube announced it would roll out a series of new enforcement changes to address its moderation concerns, but it is clear the issue remains pervasive on the site.


Ellen Tannam is a writer covering all manner of business and tech subjects

editorial@siliconrepublic.com