Humans on the frontline as Facebook attempts to tackle revenge porn

5 Apr 2017

Facebook ‘likes’. Image: Gil C/Shutterstock

Facebook is to introduce a series of software tools to tackle the problem of revenge porn, but humans will still decide what is and isn’t acceptable.

When it comes to serious issues on social media – and particularly Facebook and Instagram – few are as damaging as the posting of intimate images of a person without their consent, often referred to as revenge porn.

In the US alone, a survey found that 82pc of people who were victims of revenge porn said they endured “significant impairment in social, occupational or other important areas of their life”.

However, with more than a billion users on Facebook – and around 600m users on Instagram – addressing the problem on a global scale can be rather challenging.

Last February, Facebook founder Mark Zuckerberg wrote an open letter admitting that social media today was effectively broken, and that serious efforts to make it more inclusive and safe were more of a priority than ever.

So, in that vein, Facebook has revealed what it plans to do to stem the tide of revenge porn, if not eradicate it completely.

New photo-matching algorithms

On the face of it, however, little seems different from the reporting process that existed before.

Under the new system, a person may report a photo that they believe qualifies as revenge porn. Facebook representatives from its Community Operations team will then remove it if they agree.

At this point, the company’s new software kicks in, using photo-matching algorithms to remember the reported image and prevent it from being shared on Facebook, Messenger or Instagram.

“If someone tries to share the image after it has been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” Facebook explained.
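Facebook has not disclosed the specifics of its photo-matching technology. As a rough illustration of the general approach, the sketch below uses an open-source perceptual hash (pHash, via Python's imagehash library) to compare new uploads against a blocklist of previously reported images; the function names and threshold are purely hypothetical.

```python
# Illustrative sketch only: Facebook's actual algorithm is not public.
# Perceptual hashing produces similar hashes for visually similar images,
# so re-uploads of a reported photo can be detected even after minor edits.
from PIL import Image
import imagehash

# Hashes of images already reported and removed (hypothetical blocklist).
reported_hashes = set()

def register_reported_image(path):
    """Store the perceptual hash of an image confirmed as violating policy."""
    reported_hashes.add(imagehash.phash(Image.open(path)))

def violates_policy(path, max_distance=5):
    """Return True if an uploaded image matches a previously reported one.

    Hashes of near-identical images differ by only a few bits, so a small
    Hamming distance is treated as a match.
    """
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - reported < max_distance for reported in reported_hashes)

# Example usage:
# register_reported_image("reported_photo.jpg")
# if violates_policy("new_upload.jpg"):
#     print("This image violates our policies and cannot be shared.")
```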

Previous complaints aimed at the social media giant regarding its methods of tackling revenge porn largely revolved around the lack of a reporting structure within closed groups, where people share images of their ex-partners without their consent.

While a victim may have been aware of their image being shared, they could not report it unless they were a member of said group.

This latest effort is one of the first by Facebook to allow for greater reporting of inappropriate content on its platform.


Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com