Meta urged to pay for creating ‘echo chamber of hatred’ towards Rohingya

29 Sep 2022


The company has been called on to pay reparations for the role its ‘hate-spiralling algorithms’ played in spreading anti-Rohingya sentiments.

Amnesty International has criticised Facebook for “intensifying a storm of hatred against the Rohingya which contributed to real-world violence”, and called on parent company Meta to pay reparations to the displaced community.

Agnès Callamard, secretary general of Amnesty International, said that Facebook’s algorithms exacerbated anti-Rohingya sentiments among the Myanmar population in the months and years leading up to a series of atrocities committed against the community.

“In 2017, the Rohingya were killed, tortured, raped and displaced in the thousands as part of the Myanmar security forces’ campaign of ethnic cleansing,” she said.

The Rohingya are a predominantly Muslim ethnic minority based in Myanmar’s northern Rakhine State. Following a targeted campaign to persecute the community, Amnesty estimates that more than 700,000 Rohingya were forced to flee their homeland and take refuge outside Myanmar – largely in neighbouring Bangladesh.

“Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms,” Callamard added.

“Meta must be held to account. The company now has responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions.”

The comments came as Amnesty International published a new report today (29 September) saying Meta either “knew or should have known” the extent to which its algorithms on Facebook played a role in spreading anti-Rohingya content.

Echo chamber of anti-Rohingya sentiment

Titled The Social Atrocity: Meta and the right to remedy for the Rohingya, the report says that Facebook in Myanmar became an “echo chamber of virulent anti-Rohingya content” in the months leading up to the atrocities in 2017.

“Actors linked to the Myanmar military and radical Buddhist nationalist groups systematically flooded the Facebook platform with incitement targeting the Rohingya, sowing disinformation regarding an impending Muslim takeover of the country and seeking to portray the Rohingya as sub-human invaders,” the report notes.

The mass dissemination of messages on Facebook that incited violence and discrimination against the Rohingya “substantially increased the risk of an outbreak of mass violence” and contributed to the incidents that followed, the report claims.

Amnesty International cited Facebook whistleblower Frances Haugen, who alleged in complaints filed last year that the social media company fails to take sufficient action against hate speech on its platform, "relegates international users" through limited language capabilities and "promotes global division and ethnic violence".

In an internal document from 2019 leaked by Haugen, and analysed by Amnesty, one Meta employee said there was “evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world”.

“We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform,” the employee added.

A report from human rights NGO Global Witness earlier this year also claimed that Facebook's ability to detect Burmese-language hate speech is "abysmally poor" and that the platform approved ads containing violent and "dehumanising" speech against the Rohingya.

Call for reparations

Now, Amnesty International wants Meta to pay for its role in spreading hate speech.

The global human rights group has launched a campaign calling for Meta to meet the Rohingya's demands for remediation. The Rohingya previously asked Meta to fund a $1m education project in the refugee camp in Cox's Bazar, Bangladesh – a request that Meta denied.

“Facebook doesn’t directly engage in philanthropic activities,” the company said at the time.

Callamard said that the latest findings should raise the alarm that Meta's algorithms for Facebook and its other apps still risk contributing to similar human rights abuses in other parts of the world if the company does not make "fundamental changes" to its business model.

“Ultimately, states must now help to protect human rights by introducing and enforcing effective legislation to rein in surveillance-based business models across the technology sector,” she said.

“Big Tech has proven itself incapable of doing so when it has such enormous profits at stake.”


Vish Gain is a journalist with Silicon Republic
