Facebook to add 3,000 people to its community operations team
Facebook headquarters at Menlo Park in California. Image: Sundry Photography/Shutterstock

4 May 2017

Facebook wants to move faster to intervene in dangerous situations and keep people safe.

Social network Facebook is adding 3,000 people to its community operations team to review reports of inappropriate content in the wake of several high-profile violent video incidents.

The tech giant, which now employs more than 18,000 people (including more than 1,600 in Dublin), is grappling with the impact of Live video and mobile uploads, which have been used to broadcast violent real-world incidents.

‘If we’re going to build a safe community, we need to respond quickly’
– MARK ZUCKERBERG

Facebook recently confirmed plans to increase staff in Dublin, as part of plans to move to a new facility in the capital.

It is highly likely that some of the new community operations roles will be located in Dublin as part of the tech giant’s follow-the-sun approach.

Facebook is also currently building a €200m data centre in Clonee, Co Meath.

A very serious issue for social media has emerged

A spokesperson for Facebook said that the detection and removal of inappropriate imagery is a priority for the company, and a responsibility it takes very seriously.

These 3,000 roles will add to the 4,500 people already on its community operations team, which is tasked with reviewing reports from users.

The company will employ reviewers to help spot Live videos that might violate its rules. It also aims to cut down on content banned from Facebook, including hate speech and child abuse material.

Facebook said it is also building tools to make it simpler for users to report problems, faster for reviewers to determine which posts violate standards, and easier for them to contact law enforcement if someone needs help.

“If we’re going to build a safe community, we need to respond quickly,” said Facebook CEO Mark Zuckerberg in a post.

“We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help, or taking a post down.

“Over the next year, we’ll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week, and improve the process for doing it quickly.

“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement, who are in the best position to help someone if they need it – either because they’re about to harm themselves, or because they’re in danger from someone else.”

Zuckerberg said that the development of better tools to keep the community safe is pivotal.

“This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.

“No one should be in this situation in the first place but, if they are, then we should build a safe community that gets them the help they need,” Zuckerberg said.

By John Kennedy

John Kennedy is a journalist who served as editor of Silicon Republic for 17 years. His interests include all things technological, music, movies, reading, history, gaming and losing the occasional game of poker.
