Facebook Dublin embroiled in scandal over content moderation failures

18 Jul 2018

Facebook international headquarters in Dublin. Image: Luke Maxwell

A Channel 4 investigative report into moderation practices at Facebook’s Dublin operation shows a failure to remove content containing abuse, violence and hate speech.

Social network Facebook has become embroiled in yet another scandal, this time with the company’s Dublin operations at its core.

An undercover investigation for Channel 4’s Dispatches revealed systemic failures to remove content that users had flagged as inappropriate or recommended for removal, including graphic images and videos of violent assaults on children.

‘It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true’
– MONIKA BICKERT

The report’s revelations could see senior Facebook executives hauled before a committee of the Oireachtas, the Irish parliament, to explain the claims.

Dispatches sent an undercover reporter to work as a content moderator in Facebook’s largest centre for UK content moderation. That work is outsourced to Dublin company Cpl Resources, which has worked with Facebook since 2010.

The report claimed that thousands of reported posts remained unmoderated despite Facebook’s policy of a 24-hour turnaround on complaints. It alleged that moderators were told not to take action on, or escalate, content concerning a child visibly below the age of 13 (the minimum age for a Facebook user), even if it depicted self-harm.

The investigation indicates that pages belonging to far-right groups were allowed to remain on the platform, while policies prohibiting hate speech towards immigrants and ethnic and religious minorities were seemingly ignored by moderators.

Dispatches’ report focused on the training given to moderators to decide whether content reported to them by users – such as graphic images and videos of child abuse, self-harming and violence – should be allowed to remain on the site or be deleted.

One segment includes the undercover reporter moderating a video of two teenage schoolgirls fighting that had been shared more than 1,000 times. The reporter was told that because the video was posted with a caption condemning the violence, it should be left on the site but tagged “disturbing content”.

More clicks, more profits?

One of the most damning claims in the Dispatches report is the allegation from early Facebook investor Roger McNamee that the company’s business model benefits from extreme content because it engages viewers for longer and generates higher advertising revenue.

“From Facebook’s point of view, this is, this is just essentially, you know, the crack cocaine of their product … It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on [the] site. If you’re going to have an advertising-based business, you need them to see the ads so you want them to spend more time on the site. Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people, and so they want as much extreme content as they can get.”

One moderator told the Dispatches undercover reporter: “If you start censoring too much, then people lose interest in the platform … It’s all about making money at the end of the day.”

The revelations in the Channel 4 report were contested by Facebook’s vice-president of public policy, Richard Allan.

“Shocking content does not make us more money, that’s just a misunderstanding of how the system works … People come to Facebook for a safe, secure experience to share content with their family and friends. The vast majority of those 2bn people would never dream of sharing content that, like that, to shock and offend people. And the vast majority of people don’t want to see it. There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material. But I just don’t agree that that is the experience that most people want, and that’s not the experience we’re trying to deliver.”

The report indicates major systemic problems in Facebook’s content moderation: as well as graphic or violent content, it shows how hate speech and pages belonging to far-right groups with large numbers of followers were allowed to remain on the platform.

For example, former English Defence League leader Tommy Robinson, with more than 900,000 followers, has been given the same protected status as governments and news organisations.

In one segment, a moderator is seen telling the undercover reporter that the far-right group Britain First’s pages were left up despite breaching Facebook’s content guidelines because “they have a lot of followers so they’re generating a lot of revenue for Facebook”.

Can things get much worse for Facebook?

The report comes on the heels of the Cambridge Analytica scandal, which had reverberations for Facebook around the world. At the heart of that scandal was how data on about 87m people was captured by an app deployed on Facebook that had been created for a political consultancy, and may have been used to target ads to help swing the Brexit referendum in the UK and influence the 2016 presidential election in the US.

The latest revelations from the Channel 4 investigation are damning and worrying for Facebook, undermining its claims that it has been fighting against fake news, hate speech and other harmful content.

Crucially, it suggests a dangerously blithe approach to real human situations.

For Ireland, it puts Facebook’s massive 2,500-strong operation and partners such as Cpl in the spotlight.

Earlier this week, we called for a new Magna Carta or a social contract for social media giants to sign up to in order to re-establish trust with the public, who at this point are likely to feel sorely let down.

In response to the revelations, Facebook said it is taking action to update its training practices across all teams, not just Cpl.

“This week, a TV report on Channel 4 in the UK has raised important questions about those policies and processes, including guidance given during training sessions in Dublin,” said Monika Bickert, vice-president of global policy management at Facebook.

“It’s clear that some of what is in the programme does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again. For example, we immediately required all trainers in Dublin to do a retraining session, and are preparing to do the same globally. We also reviewed the policy questions and enforcement actions that the reporter raised, and fixed the mistakes we found.”

Bickert said the claim that Facebook values revenues over morals was wrong.

“It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true. Creating a safe environment where people from all over the world can share and connect is core to Facebook’s long-term success. If our services aren’t safe, people won’t share and over time would stop using them. Nor do advertisers want their brands associated with disturbing or problematic content.”

If you thought 2018 could not get any worse for Facebook, well, it looks like it just has. And this time Dublin, rightly or wrongly, is in the spotlight.

John Kennedy is a journalist who served as editor of Silicon Republic for 17 years

editorial@siliconrepublic.com