YouTube enlists the help of Wikipedia to stem the tide of conspiracy videos

14 Mar 2018

YouTube app. Image: Jirapong Manustrong/Shutterstock

YouTube will introduce a new information tool to help fight its severe problem with conspiracy videos.

At SXSW 2018, YouTube CEO Susan Wojcicki discussed the problem her platform has with conspiracy and hoax videos gaming its algorithms and taking over the site.

She acknowledged the vast extent of the issue and explained that the platform would be launching a new tool in an effort to fight the spread of disinformation.

A barrage of bad press

YouTube has faced scrutiny for a litany of negative events in recent months, from the Logan Paul controversy to the proliferation of conspiracy videos around the school shooting in Parkland, Florida.

Videos propagating conspiracy theories about events such as the moon landing will now be accompanied by text from Wikipedia providing facts that counter the viewpoint of the video.

Wojcicki said YouTube’s goal is “to start with a list of conspiracies around the internet where there’s a lot of active discussion”.

Not a perfect solution

The plan is not without its flaws. As The Guardian reported, while Wikipedia is a good resource for refuting claims about events in the past, it is not well equipped to deal with breaking news events such as the Parkland shooting. As Wikipedia itself notes: “Wikipedia is an encyclopaedia, not a newspaper … as a result, our processes and principles are designed to work well with the usually contemplative process of building an encyclopaedia, not sorting out the oft-conflicting and mistaken reporting common during disaster and other breaking news events.”

While YouTube did say that Wikipedia could be one of many third-party resources for the new tool, the move still suggests a company unwilling to engage more meaningfully with the dangerous content it profits from.

Limits for moderators

YouTube recently said it would be hiring thousands of new moderators to monitor the platform for content that violates its guidelines.

At SXSW, Wojcicki announced that these part-time moderators would be limited to viewing disturbing content for four hours per day. She said: “This is a real issue and I myself have spent a lot of time looking at this content over the past year. It is really hard.”

In the US, federal laws that shield tech firms from liability for user-uploaded content still require them to remove illegal material from their platforms. YouTube relies on automated tools such as its Content ID system and enlists human moderators for more serious violations, including depictions of murder or suicide. Often, the people hired by YouTube are contractors and therefore do not have the same access to mental health assistance as full-time Google employees.

Between the sheer volume of content and the growing psychological burden on moderators, this issue will only intensify as time goes on.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects.

editorial@siliconrepublic.com