Your YouTube recommendations are about to see some changes


29 Aug 2019

Image: Nick Ansell/PA

Having seen results from adjustments already implemented in the US, YouTube is furthering trials of changes to its recommendations engine.

YouTube has started testing changes to the way videos are recommended to users in Ireland in order to prevent borderline content and misinformation from spreading, the platform’s chief has revealed.

Adjustments to the algorithm have already cut views of this sort of content from recommendations by half in the US, where the changes were introduced at the beginning of 2019.

Filtering out false information

The video-sharing website, like many other online tech giants, has grappled with balancing the right to free speech against a backdrop of potentially harmful content, such as questionable miracle cures, flat earth conspiracies and false information about historic events such as 9/11.

In a study published earlier this year, a researcher showed that YouTube’s algorithms will regularly suggest videos filled with false information when searching for topics related to the ongoing climate crisis.

YouTube chief executive Susan Wojcicki said the company is working to “reduce spread of content that brushes right up against our policy line” in a bid to allow quality content “more of a chance to shine”.

The tweaked recommendations system is now being trialled in a number of English-language markets, including the UK, Ireland and South Africa.

When announcing the US trial at the end of January, YouTube said it believed “limiting the recommendation of these types of videos will mean a better experience for the YouTube community”.

Improved user experience

In Wojcicki’s quarterly letter published this week, she told creators about her aim to preserve openness on the platform, but admitted it would not be easy.

“It sometimes means leaving up content that is outside the mainstream, controversial or even offensive,” Wojcicki wrote.

“But I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views.”

The YouTube boss set out the four pillars of the company's approach to responsibility: removing violating content, raising up authoritative voices, reducing the spread of borderline content, and rewarding trusted creators.

In June, the site began offering users greater insight into why some videos are suggested to them, as well as adding an option to remove a channel suggestion amid concerns about the types of material its algorithm highlights.

– PA Media