Facebook’s Sheryl Sandberg apologises for creepy user study

3 Jul 2014

Facebook chief operating officer Sheryl Sandberg

Facebook’s secret study, in which the moods of 700,000 users were tested by inserting positive and negative content into their News Feeds without their knowledge, was “poorly communicated”, said chief operating officer Sheryl Sandberg.

Sandberg made the comment during a trip to India to promote the use of the social network’s platform by small businesses.

“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg was quoted as saying in The Wall Street Journal.

“And for that communication we apologise. We never meant to upset you.”

Revelations last weekend that in 2012 Facebook data scientists tampered with what 700,000 users saw in their News Feeds to see how it affected their moods caused uproar.

The ‘Emotional Contagion’ research was legal because every Facebook user consents to the use of their data for “data analysis, testing and research” as part of the 9,000-plus words of Facebook’s Terms and Conditions.

The study came to light when it was published in the prestigious Proceedings of the National Academy of Sciences.

The implication is that putting positive content in a user’s News Feed prompts them to post positive updates, while negative content prompts negative ones. Post emotional content in their News Feed and they are more likely to post, comment and share; post bland content and they won’t post at all.

Ultimately, the key learning for Facebook is what kind of content (aka advertising) may trigger an emotional response. But for conspiracy theorists and those who fear a dystopian, Orwellian future, the implications are creepier.

But it turns out the study was only one of many by Facebook’s data science team. According to The Wall Street Journal, the team consists of three dozen researchers who have unique access to the social network’s 1.3bn user base.

A former data scientist from the team, Andrew Ledvina, said that anyone on the data science team could run any kind of test, and that researchers were always trying to alter people’s behaviour, in some cases without telling the company.

Typical studies ranged from anti-fraud measures to research on how families communicated, the causes of loneliness and how political mobilisation messages affected elections.

Right now, as we enter a world where big data will rule, George Orwell must be turning in his grave.

John Kennedy is a journalist who served as editor of Silicon Republic for 17 years

editorial@siliconrepublic.com