The guidelines for removal of content on Facebook have been revealed, and they may be more lenient than you think.
Facebook currently faces major incidents on its platform on a monthly, if not weekly, basis. These run the gamut from trolling to suicides broadcast on Facebook Live.
In response, the social media giant has promised to do more to tackle incidents among its 1bn-plus users as they happen, recruiting more people to react as quickly as possible.
This was as much as anyone knew until The Guardian’s recent leak of more than 100 documents given to Facebook employees.
The documents detail how content-control staff should react in the face of hate speech, terrorism and attempts at suicide, outlining instances where it is acceptable to keep distressing content online.
For example, Facebook said that live streams of self-harm will not be taken down as the company “doesn’t want to censor or punish people in distress”.
Taking a similar stance, Facebook says videos showing violent deaths, whether uploaded or recorded live, will likely remain in place, as it believes they can help raise awareness of issues such as mental illness.
Dublin office cited as example
Deciding what is and isn’t free speech remains one of Facebook’s biggest challenges as it attempts to shed its reputation as a platform for ‘fake news’, a problem it eventually accepted responsibility for in the wake of founder Mark Zuckerberg’s initially dismissive comments.
According to the leaked slides, Facebook told staff that any threat made against a head of state, such as US president Donald Trump, should be taken down as soon as possible.
Cited examples worthy of deletion include: “I’ll destroy the Facebook Dublin office” and “someone shoot Trump”. However, the likes of “kick a person with red hair”, or “let’s beat up fat kids”, can stay.
Facebook’s head of global policy management, Monika Bickert, said it was always going to be difficult to create standards when things aren’t necessarily black and white.
“No matter where you draw the line, there are always going to be some grey areas. For instance, the line between satire and humour and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not.”