Facebook reaches $52m settlement with content moderators over PTSD claim

13 May 2020


Image: © wolterke/Stock.adobe.com


Facebook content moderators past and present will receive a minimum of $1,000 after the social network agreed a settlement over PTSD claims.

Facebook has agreed to pay a settlement of $52m to current and former content moderators who said they developed mental health issues from their work on behalf of the social media firm. According to The Verge, a preliminary settlement was filed in a California superior court, agreeing to pay damages covering 11,250 moderators and to provide more counselling for existing staff.

Each of these moderators, past and present, will receive at least $1,000, with the potential to receive additional compensation if they are diagnosed with a mental health condition such as post-traumatic stress disorder (PTSD).

This reportedly includes an additional $1,500 for those diagnosed with a mental health condition, while moderators who receive multiple diagnoses – including PTSD and depression – could be eligible for up to $6,000. For now, the settlement only covers moderators who worked in the states of California, Arizona, Texas and Florida from 2015 onwards.

In a statement, the lawyer for the plaintiffs, Steve Williams, said that Facebook agreed to create an “unprecedented” programme that was “unimaginable even a few years ago”.

“The harm that can be suffered from this work is real and severe,” he added.

How things will change at Facebook

Under the terms of the settlement, Facebook has also agreed to change how harmful content is moderated going forward. The company said that audio will be muted by default and videos will be displayed in black and white. This standard is expected to be rolled out to 80pc of Facebook's moderators by the end of this year and across the board by the end of next year.

Support programmes will also be ramped up, with moderators who have to view some of the site’s most graphic content getting access to one-on-one sessions with a mental health professional. Additionally, those experiencing severe mental health issues as a result of the work will be given access to a counsellor within 24 hours.

The social network issued its own statement, saying the company is grateful to those who “work to make Facebook a safe environment for everyone”.

“We’re committed to providing them additional support through this settlement and in the future,” Facebook said.

Last December, it was revealed that a group of former moderators who worked on behalf of Facebook in Dublin had begun legal action against the company in the High Court. The group was seeking damages for personal injuries, stating they were exposed to graphic, sexual and violent content during their employment by third-party outsourcing company CPL in Dublin.

In a standards update released yesterday (12 May), Facebook said it plans to increase automated content moderation, leveraging AI to take action on posts and allowing human moderators to focus on content that requires more nuance and context.

Colm Gorey is a senior journalist with Siliconrepublic.com

editorial@siliconrepublic.com