Facebook funded a pilot project that attempted to deradicalise people with extremist views via Facebook Messenger.
Big tech companies have come under fire of late for the radicalisation and propagation of extremist views that can take place on their platforms.
The EU has had measures in place to monitor the progress of firms such as Facebook and Twitter for several months now, but it looks like Facebook has been piloting a separate project to counter radicalisation on its own platform.
According to the BBC, counter-extremism organisation Institute for Strategic Dialogue (ISD) led a pilot project on Facebook Messenger designed to challenge the views of people posting far-right and Islamist content on the platform.
Counter-terrorism efforts from Facebook
The ISD said it used software to scan extremist pages for targets and manually reviewed their respective profiles for instances of violent and hateful language. In total, 569 people were contacted, and 76 of them had conversations of five or more messages.
Researchers claim that eight participants felt a positive impact as a result of the discussions. Eleven so-called ‘intervention providers’ were employed to carry out the pilot project, many of them former extremists, survivors of terrorist incidents or trained counsellors.
Colin Bidwell, a survivor of 2015’s terror attack in Tunisia, spoke to people under a fake profile about their beliefs. Chief executive of the ISD, Sasha Havlicek, said the aim of the project was to “walk them [extremists] back from the edge, potentially, of violence”.
Havlicek added: “There’s quite a lot of work being done to counter general propaganda with counter-speech and the removal of content, but we know that extremists are very effective in direct messaging.”
There has been some concern from privacy advocates about the ethics of the project, particularly as Facebook funded an initiative breaking one of its major rules by creating fake accounts.
A solicitor at the Privacy International charity, Millie Graham Wood, said: “If there’s stuff that they’re identifying that shouldn’t be there, Facebook should be taking it down.
“Even if the organisation [ISD] itself may have been involved in doing research over many years, that does not mean that they’re qualified to carry out this sort of … surveillance role.”
The ISD workers did not disclose that they were part of a project unless directly asked, which happened on several occasions.
The group now wants to examine the project’s potential for other platforms such as Reddit and Instagram.
Both far-right and Islamist groups are using Facebook for recruitment. Wired recently reported on an organised network of Facebook pages targeting British users with right-wing content.
Earlier in February, Facebook said it would fight extremist content after consumer goods giant Unilever threatened an advertising boycott unless the epidemic of abusive content was tackled.