Ahead of the 2020 US election, researchers will look at how Facebook and Instagram users are affected by newsfeeds and targeted ads – and how this may impact the political climate.
On Monday (31 August), Facebook announced a new research project examining the company’s influence on political participation, polarisation, and knowledge and trust in the US election.
The project is being led by independent researchers Prof Talia Stroud and Prof Joshua A Tucker, along with around 15 other external academics who will work with Facebook’s own researchers. The goal is to make scientific assessments of Facebook and Instagram’s impact on political behaviour and attitudes, and to produce findings that will be readily available to the public.
The news was announced by Chaya Nayak, who is head of Facebook’s open research and transparency team, along with former deputy UK prime minister Nick Clegg, who is now Facebook’s vice-president of global affairs and communications.
In a blogpost, Nayak and Clegg outlined how Facebook has become “a stage for democratic debate”, which gives the social media company “a big responsibility” when it comes to elections. As a result, the company is now building on a political research initiative it launched in 2018.
Plans for the research
Nayak and Clegg said the new project will examine the impact of how users interact with content on Facebook’s newsfeed and across Instagram during the 2020 US election, as well as the role of features such as content ranking systems.
“Three principles guide our work, and will continue to do so as we move ahead: independence, transparency and consent,” they wrote.
The external researchers will not be paid by Facebook and, according to the company, they “won’t answer to Facebook either”. The company said it will not restrict the questions researchers ask or the conclusions that they draw.
The researchers will publish their findings in an open-access format so that the results are freely available to the public. The company is asking for explicit, informed consent from users who opt to be part of the research, which may analyse individual-level data.
Stroud and Tucker, along with Facebook researchers Annie Franco and Chad P Kiewiet de Jonge, wrote a separate blogpost outlining the motivations for the research and how they aim to keep it transparent.
When users opt into the study, researchers will split them into groups and experiment with their newsfeeds and targeted advertisements; in some cases, participants will be asked to stop using Facebook temporarily.
While this takes place, the research team will poll participants on their experiences to learn how their viewpoints evolve and how they compare with those of control groups.
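In broad terms, this is a randomised experiment: consenting participants are assigned at random to different conditions and their survey responses are later compared with those of a control group. The Python sketch below is purely illustrative and uses hypothetical group names and participant IDs; it is not the researchers’ actual assignment procedure.

    import random

    # Illustrative sketch only: the arm names and participant IDs are hypothetical,
    # not the study's actual design. It shows random assignment of opted-in
    # participants to treatment arms and a control group.
    ARMS = ["control", "altered_newsfeed", "altered_ads", "temporary_deactivation"]

    def assign_arms(participant_ids, seed=2020):
        """Randomly assign each opted-in participant to one experimental arm."""
        rng = random.Random(seed)
        return {pid: rng.choice(ARMS) for pid in participant_ids}

    # Example: ten hypothetical opted-in participants
    assignments = assign_arms([f"user_{i}" for i in range(10)])
    print(assignments)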
According to the researchers, Facebook will not have any right of pre-publication approval and will only be entitled to check that papers do not violate legal or privacy obligations.
In the research team’s blogpost, the authors wrote: “We know everyone will be anxious to see the results. As we will need time to analyse the data, we anticipate that findings will be ready to be shared in the summer of 2021 at the earliest.
“We hope that the effort will be judged as worthwhile once the studies, designed to make sense of what has been a remarkable transformation of the political process in the digital information age, are complete.”