YouTube quickly removes 9/11 explainer from Notre Dame fire video

16 Apr 2019

The Notre Dame cathedral on fire. Image: Olivier Mabelly/Flickr (CC BY-NC 2.0)

As thousands of people turned to YouTube to watch the Notre Dame cathedral burn, they were being shown 9/11 explainer ‘knowledge panels’.

YouTube has said it has gone to great lengths to stop the spread of misinformation on its platform, but its algorithms appear to have unintentionally created just such a problem where none existed before.

The Guardian reported that as thousands of people turned to YouTube for footage of the devastating Notre Dame fire last night (15 April), users in the US and South Korea began to see ‘knowledge panels’ appearing over the video feed. Originally introduced to show links to Encyclopaedia Britannica and Wikipedia on videos flagged as spreading false information, the panels instead displayed a 9/11 explainer over the Notre Dame footage.

In doing so, they created a false association between the devastating fire in Paris and the New York terrorist attack nearly 18 years ago. In a statement, YouTube apologised for its algorithms’ major error.

“We are deeply saddened by the ongoing fire at the Notre Dame cathedral. Last year, we launched information panels with links to third-party sources like Encyclopaedia Britannica and Wikipedia for subjects subject to misinformation,” it said.

“These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.”

‘Augment human intelligence rather than replace it’

This comes just a week after YouTube had to close the comments section on its live stream of a US congressional hearing on online hate speech because of relentless hate speech posted there.

Last January, the Google-owned company promised that it would significantly reduce the number of conspiracy theory videos recommended to its users. At the time, it did not define exactly what would be classified as harmful misinformation, but said it would include “videos promoting a phoney miracle cure for a serious illness, claiming the Earth is flat or making blatantly false claims about historic events like 9/11”.

The Notre Dame incident has once again raised questions over YouTube’s evasiveness and reluctance to share what drives its algorithms, which decide what content is recommended to users.

Speaking about YouTube’s future, Danaë Metaxa, a PhD candidate and online democracy researcher at Stanford University, said: “As tech companies play an increasingly key role in informing the public, they need to find ways to use automation to augment human intelligence rather than replace it, as well as to integrate journalistic standards and expertise into these pipelines.”


Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com