Facebook says that a group backed by the Kremlin appeared to be trying to influence US voters in the days prior to the midterm elections.
The crucial midterm elections in the US saw some of the most hotly contested political races in the country's history. All eyes have been on social media platforms such as Facebook as they tackle the problem of interference, with mitigation measures put in place following the 2016 presidential race.
Yesterday (6 November), a few hours after most polls had closed, Facebook’s head of cybersecurity policy, Nathaniel Gleicher, said the company had blocked more than 115 Facebook and Instagram accounts “due to concerns that they were linked to the Russia-based Internet Research Agency”. The accounts had been publicised a day earlier, but the company had not made the connection at that time.
Alleged influence over US election
Earlier in 2018, members of the same Russian group were indicted in the US on conspiracy charges alleging that they had attempted to influence the 2016 election through a major multi-platform social media campaign.
Gleicher added: “This is a timely reminder that these bad actors won’t give up, and why it’s so important we work with the US government and other technology companies to stay ahead.”
The company did not originally specify the origin of the activity on Monday (5 November), but explained that it still wanted to let users know what was happening as events unfolded.
Gleicher yesterday linked the Internet Research Agency to this coordinated behaviour, saying: “A website claiming to be associated with the IRA [Internet Research Agency] published a list of Instagram accounts they claim to have created.” He added: “We had already blocked most of these accounts yesterday, and have now blocked the rest.” While the link was not categorical, there was reason enough to make the association public.
US authorities had tipped off Facebook to suspicious behaviour that might be linked to a foreign entity the night before midterm election day, according to a company blogpost. Of the removed accounts, 85 were posting in English on Instagram, while 30 others were on Facebook itself, associating with pages using French and Russian.
Social media platforms under the microscope
In the run-up to the recent elections, both Facebook and Twitter pulled down millions of posts and closed accounts linked to Russian operations, as well as other nation-state actors located in Iran and elsewhere.
Given the fallout of the 2016 US presidential election, social media platforms have been funnelling cash and resources into content moderation tools, as well as boosting staff numbers to manage the misinformation issue.
The problems with Facebook and politics extend far beyond the US electoral system. Earlier this week, the company said it agreed with a report that found it failed to prevent its platform being leveraged to incite violence in Myanmar.
The independent report, commissioned by Facebook itself, said that while the company had made progress on the issues, there was still “more to do”. Product policy manager Alex Warofka said: “We have invested heavily … to examine and address the abuse of Facebook in Myanmar.”
Since the 2016 election, Facebook has hired 10,000 people to work on platform security, and in October it unveiled a “war room” at its California headquarters where a team monitors disinformation campaigns. As this new era of digital democratic interference continues to roll on, the response from social media firms will be keenly monitored by citizens and politicians alike.