We need a social media watchdog with teeth

23 Jul 2018


Image: Falcona/Shutterstock


The need for a digital safety commissioner is clearer now than ever, writes John Kennedy.

There was a point during last week’s undercover investigation for Channel 4’s Dispatches where I had to reef my headphones from my head because I just could not listen.

The sound of a child crying in distress while being savagely beaten by an adult was sickening and would haunt my mind for days after.

What was just as sickening was the revelation that the video had been posted on social media platform Facebook six years ago and, despite countless protests and complaints, it was left online and simply tagged ‘disturbing content’.

Yes, the perpetrator was eventually tracked down and arrested, but why was the video still left online?

Also aggravating for me was that, despite the voices being masked, you could make out an Irish accent: “If you start censoring too much, then people lose interest in the platform … It’s all about making money at the end of the day.”

It was uncomfortable viewing for anyone with a conscience and, even more uncomfortable for Ireland, it raised questions about our unofficial status as the Silicon Valley of Europe in terms of the presence of so many digital giants and their work here.

The investigation saw a Channel 4 reporter go undercover and work as a contractor with Dublin company Cpl Resources, which has worked with Facebook since 2010.

The report claimed that thousands of reported posts remain unmoderated despite Facebook’s policy for 24-hour turnaround on complaints. It alleged that moderators were told not to take action or report content regarding a child visibly below the age of 13 (the minimum age for a Facebook user) even if it includes self-harming. It also alleged that pages belonging to far-right groups were allowed to remain on the platform while policies allowing hate speech towards ethnic and religious immigrants were seemingly ignored by moderators.

The revelations in the Channel 4 report were contested by Facebook vice-president of public policy, Richard Allan. “Shocking content does not make us more money, that’s just a misunderstanding of how the system works … People come to Facebook for a safe, secure experience to share content with their family and friends.

“The vast majority of those 2bn people would never dream of sharing content like that, to shock and offend people. And the vast majority of people don’t want to see it.”

So then, why leave it up?

For its part, Facebook said that it was taking these mistakes seriously and immediately began retraining the moderators.

Moral responsibility in these digital times

But the genie was out of the bottle, with An Taoiseach Leo Varadkar, TD, condemning the revelations as “shocking and unacceptable” and asking why Facebook didn’t uphold its own standards. Communications Minister Denis Naughten, TD, sought an urgent meeting with Facebook during a whistle-stop trip to New York last week.

The whole saga occurred during a week where even Facebook CEO and co-founder Mark Zuckerberg found himself in a related debate over the social network’s practices around censorship and why some content that is offensive to many isn’t removed, even content posted by Holocaust deniers.

In an interview with Recode’s Kara Swisher, Zuckerberg – who is Jewish – suggested that Facebook does not censor information but simply won’t promote controversial or incorrect information in its news feed.

Facebook, which is licking its wounds and trying to shut the barn door after the Cambridge Analytica affair involving the use of 87m users’ data, is coming across as a company very much at war with itself.

On the one hand, it wants to connect the world. On the other hand, it is a 14-year-old company still coming to terms with what it is and what it stands for.

It is a community of 2bn people, but even a tiny fraction of 2bn people can still be up to no good.

Facebook doesn’t want to be a police officer or a censor. It has claimed not to be a publisher but makes its revenues from advertising. Whatever it is, it has a responsibility for decency.

This morning (23 July), we reported how Facebook is taking action to investigate the practices of another data firm called Crimson Hexagon regarding alleged practices in gathering data and links with US government agencies, the Turkish government and possibly the Kremlin through a non-profit.

Facebook appears to want to be the good guy but it is seemingly struggling to understand its own complexities and rationale. It is slowly coming to terms with the real-world impact its technology is having.

For example, Sri Lanka recently temporarily shut down Facebook after hate speech spread on the company’s apps such as WhatsApp, contributing to mob violence. WhatsApp has also had to be reconfigured in India for similar reasons.

But back to Ireland. This country has been and will continue to be a good place for so many digital giants to do business. However, we need to shift position from being grateful for all the jobs that have been created to also realising we hold a moral responsibility in all of this.

In many ways, just as the Data Protection Commissioner unofficially became the data commissioner for the rest of Europe by virtue of so many tech behemoths here, we need to realise that many of the practices around data and content are happening on our watch. For that reason, Ireland needs to set the world standard for decency.

If data is the new oil, then Ireland is home to the refineries. And with that amount of data comes serious responsibility.

It is puzzling therefore that the country has so far failed to appoint a digital safety commissioner to handle issues that the digital world has presented, including bullying, online harassment, hate speech and how speedily companies handle complaints over content posted on their platforms.

It emerged recently that Minister Naughten rowed back on a commitment to appoint a digital safety commissioner, claiming that the process was “far more complex” than he expected.

The Government’s Action Plan for Online Safety, published in recent weeks, dropped the commitment to create a digital safety commissioner, ostensibly because the role required more attention and groundwork to be laid.

True, such a role requires careful consideration and attention. However, if the revelations of the past week prove anything, it is that we cannot shilly-shally on this.

Ireland needs to show leadership and establish some kind of social media watchdog with teeth and without delay.

The world is watching.


Editor John Kennedy is an award-winning technology journalist.

editorial@siliconrepublic.com