The social media giant is banning any pages, groups and Instagram accounts representing QAnon, and will ‘proactively detect content for removal’.
Facebook has said it will strengthen its measures to tackle the radical, far-right conspiracy movement QAnon on its platforms.
In August, the company published a statement outlining how it would disrupt the ability of QAnon and militarised social movements to organise on Facebook and Instagram. Since then, it has removed more than 1,500 pages and groups related to QAnon that contained “discussions of potential violence”.
In an update yesterday (6 October), Facebook said any pages, groups and Instagram accounts representing QAnon will be taken down, even if they don’t contain violent content.
The company’s dangerous organisations operations team will enforce the new policies and “proactively detect content for removal instead of relying on user reports”, it said.
Why the update?
A number of incidents between August and now have motivated Facebook to take further action against QAnon. Though it initially focused on removing QAnon content supporting violence, the company said that it has since seen “other QAnon content tied to different forms of real-world harm”.
This has included claims that certain groups started the recent wildfires along the west coast of the US, which, according to Facebook, diverted the attention of local officials away from fighting the fires.
“Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another,” Facebook said. “We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”
Another amendment to Facebook’s policy includes directing users to reliable child safety resources when they search for child safety hashtags, which have been taken over by QAnon groups. Facebook said it “expects renewed attempts” by QAnon to evade detection and will update its policy and enforcement as needed.
Action against QAnon
The origins of the conspiracy movement QAnon in the US can be traced back to 2017. Since then it has cultivated a following known for spreading disinformation online, focusing on so-called ‘deep-state conspiracies’ such as public figures engaging in child sex abuse and exploitation.
Other social media platforms have also taken action against QAnon. In July, Twitter said it was permanently suspending the accounts of users discussing QAnon topics through multiple accounts, coordinating abuse or trying to evade prior suspensions.
Facebook has also taken action against other online violence and disinformation campaigns recently. In July, it announced that any content promoting violence from a network called ‘boogaloo’ would be banned across its platforms. It removed around 300 Facebook and Instagram accounts, more than 400 additional groups and more than 100 pages in the purge.
The platform also removed a post from Donald Trump yesterday in which the US president claimed Covid-19 was less dangerous than the flu. Facebook’s policy communications manager, Andy Stone, said the post was taken down because the platform “removes incorrect information about the severity of Covid-19”.