The risks of far-right content falling back to the Fediverse

20 Mar 2023


ISD Global’s Ciarán O’Connor explains how decentralised platforms are being used to store and spread extremist content, and the difficulties involved in tackling it.

Efforts to manage extremist content online have come to a head in recent months, as governments and organisations move to curb its growth.

Towards the end of 2022, Europol and 13 EU member states – including Ireland – conducted an international operation to flag far-right extremist and terrorist content online, resulting in 831 detected items across 34 platforms.

The European Commission also ramped up efforts in January, ordering 22 member states to take extra steps in tackling the spread of terrorist content. Ireland, meanwhile, is in the process of forming an anti-disinformation strategy to deal with “organised campaigns of manipulation of internet users”.

But despite these removal efforts, there is evidence that such groups are simply shifting to smaller, decentralised platforms to avoid content takedowns. This network of interconnected social platforms is known as the Fediverse.

A recent report by the non-profit Institute for Strategic Dialogue (ISD) claims that right-wing extremist actors are moving to decentralised video platforms such as Odysee and PeerTube as hate and conspiracy-style content is removed from mainstream platforms like YouTube.

The report focuses on far-right content in Germany, but ISD senior analyst Ciarán O’Connor told SiliconRepublic.com that a similar trend has been observed in Ireland.

One example O’Connor gave is video content being used to spread false or misleading claims about asylum seekers in Ireland, along with “content that potentially incites hatred and distrust in the process”.

“There’s no one single platform that your standard Irish far-right activist might use,” O’Connor said. “They’ll use different platforms for different reasons.”

O’Connor said these decentralised spaces are becoming more useful for Irish far-right groups because there’s “an awareness” that if they post certain content onto mainstream platforms, their accounts will likely receive strikes or be banned for breaching the platform’s rules.

“The utility of decentralised platforms is there’s often very few community guidelines or – purely because of the technology underlying – content is next to impossible to totally remove from those platforms,” O’Connor said.

Last year, a group of UK scientists argued that the crackdown on social media misinformation risked driving people towards “harder-to-access corners of the internet”.

Safe spaces for extremist groups

O’Connor said another benefit these decentralised platforms offer extremist groups is a space where they face less scrutiny.

He said the content posted on these platforms isn’t under the spotlight to the same degree as mainstream platforms such as YouTube or Facebook, making it easier for these groups to share content.

The ISD Global report looked at Odysee and PeerTube, which are two decentralised platforms. PeerTube offers free software for users to create their own “mini YouTubes” for sharing specific content.

The report said Odysee also lets users monetise their content and “store it on decentralised servers so that it is practically impossible to delete”.

The report said extremist content found on these platforms included conspiracy theories around topics such as the Covid-19 pandemic and Russia’s invasion of Ukraine. There were also videos on Holocaust denial and a livestream of the 2019 Christchurch far-right terrorist attack.

“Neither Odysee, nor PeerTube are right-wing extremist in and of themselves,” the report said. “However, the technology used and affordances offered by these video services makes them attractive for right-wing extremist groups which are blocked on larger social media platforms.”

These platforms also benefit far-right groups when they try to share content on mainstream platforms, as videos and images become harder to fully remove when they originate from decentralised platforms.

O’Connor said harmful or violent content can usually be removed faster if it is uploaded directly to a platform such as Facebook or YouTube, but removal becomes more complicated when third-party links are involved.

The variety of decentralised platforms available and the blockchain technology behind some of these sites also make this content very difficult to remove completely.

“You’ll see extremist figures, they might create a video about some recent protest and they’ll put it on Odysee, but they’ll also put it on BitChute and they also might put it on Rumble and there’s a kind of multi-platform approach so that, a lot of these platforms are kind of archives or host spaces as well,” O’Connor said.

Countering decentralised content

Overall, the ISD Global report highlights the difficulties of countering the spread of extremist content on decentralised platforms.

The report said it is unclear whether the EU’s upcoming Digital Services Act will apply to some of these platforms, as not all of them operate with a profit motive and could therefore be exempt from the act.

The report also noted that even if individual extremist instances are isolated, the videos “are still accessible so long as they have a hosting provider and other essential internet structures”.

One example of extremist content being successfully removed is Kiwi Farms, according to the report. The 8chan-style internet forum was infamous for harassment and targeted threats against various social groups, including the transgender community.

Last year, Cloudflare moved to withdraw its security services from the online forum, citing an “immediate threat to human life”.

“This is an extraordinary decision for us to make and, given Cloudflare’s role as an internet infrastructure provider, a dangerous one that we are not comfortable with,” Cloudflare CEO Matthew Prince wrote at the time.

The report noted that, while the decision was effective against Kiwi Farms, switching off entire websites is a tactic that can be misused by authoritarian regimes for censorship efforts.

“Additionally, content that is illegal or anti-constitutional often only makes up a fraction of the entire content on an instance,” the report said. “Therefore, entirely blocking access to the instance would only rarely be proportional and justified.”

Regardless of the potential measures that could be taken, the ISD Global report said that governments and organisations “must address” the changes occurring within the far-right extremist ecosystem online.

“This means clarifying which platform architectures are truly decentralised and which only market themselves as such to skirt responsibility and legal accountability,” the report said. “It also means finding out which opportunities exist for regulatory and non-regulatory countermeasures.”


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com