Content moderation failures have raised the ire of Ireland's leaders.
The revelations from a Channel 4 investigative report into moderation practices at Facebook's Dublin operation have made Ireland's leaders deeply uncomfortable, and they want answers.
Communications Minister Denis Naughten, TD, will meet with Facebook executives in New York today (19 July) while Taoiseach Leo Varadkar, TD, condemned the revelations as “shocking and unacceptable”.
‘Clearly Facebook has failed to meet the standards the public rightly expects of it’
– MINISTER DENIS NAUGHTEN, TD
The Taoiseach said the Government is examining legislative mechanisms to ensure companies uphold “basic standards of decency”.
Varadkar said that Ireland needs an explanation from Facebook as to why it didn’t uphold its own standards.
An undercover investigation for Channel 4's Dispatches revealed systemic failures to remove content that users had flagged as inappropriate or recommended for removal, including graphic images and videos of violent assaults on children.
Dispatches sent an undercover reporter to work as a content moderator in Facebook’s largest centre for UK content moderation. That work is outsourced to Dublin company CPL Resources, which has worked with the social network since 2010.
Not only did the reporter discover systemic failures to remove flagged content, but the investigation also revealed that pages belonging to far-right groups had been allowed to remain, and that hate speech directed at immigrants on ethnic and religious grounds was ignored by moderators.
For its part, Facebook's vice-president of global policy management, Monika Bickert, denied that the social network had turned a blind eye to bad content for commercial reasons and said that a retraining regime has been initiated.
“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention,” Bickert said.
In a statement, Minister Naughten said that illegal activity found in content on Facebook cannot simply be ignored by a platform that hosts it and enables it to be shared.
“I am attending the UN High Level Political Forum on Sustainable Development in New York this week. However, I am aware of the contents of the Channel 4 programme and I am deeply concerned,” he said in a statement last night.
“The programme which was broadcast raises serious questions for the company in respect of the manner in which it handles reports of harmful or illegal content carried on its platform; the internal procedures it has in place to moderate harmful or illegal content on its platform; and the systems the company has in place to report instances of abuse, suspected abuse or other illegal activity to the appropriate authorities, including An Garda Síochána.
“Clearly Facebook has failed to meet the standards the public rightly expects of it. I have sought an urgent meeting with Facebook management and this meeting is taking place here in New York on Thursday 19 July.”
Is battling hate speech censorship?
Since the revelations, Facebook has found itself enmeshed in another related controversy. CEO Mark Zuckerberg added fuel to the fire when he said that Facebook would continue to provide a platform for debate, even to Holocaust deniers.
In an interview with Recode’s Kara Swisher, Zuckerberg – who is Jewish – suggested that Facebook does not censor information but simply won’t promote controversial or incorrect information in its News Feed.
“What we will do is we’ll say, ‘OK, you have your page and, if you’re not trying to organise harm against someone or attacking someone, then you can put up that content on your page even if people might disagree with it or find it offensive.’ But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.”
The comments come at a time when Facebook is under scrutiny over how misinformation shared on the platform contributed to violence in Myanmar and Sri Lanka.
Sri Lanka temporarily shut down Facebook earlier this year after hate speech spread on the company’s apps contributed to mob violence.
Since Zuckerberg’s comments went public, Facebook has revealed a new policy that involves reviewing posts that are inaccurate or misleading, or shared with the intent of causing violence or physical harm, including manipulated imagery.
“There are certain forms of misinformation that have contributed to physical harm and we are making a policy change which will enable us to take that type of content down,” the company said in a statement. “We will begin implementing the policy during the coming months.”
As Ireland's politicians add their voices to a chorus demanding answers, Zuckerberg and his fellow executives are learning that with great amounts of data and content comes serious responsibility.