As extensive unchecked data mining becomes a familiar concept, we must not settle into this as the new normal, writes Elaine Burke.
I dozed off twice watching The Great Hack.
That’s not a critique of the film or filmmakers. The very same thing happened while watching The Inventor: Out For Blood In Silicon Valley by the celebrated documentarian Alex Gibney. And, admittedly, on both of these occasions, I was dog-tired.
When I’m tired, the most likely thing to lull me to sleep is the motion of a bus, but sometimes just the TV will do it. Not everything on TV, really, but a football match or a snooker game can be a dead cert to push me into the zen zone that precedes a satisfying nap. The sports themselves are compelling, but the colours, the patterns of play, and the recognisable cast of characters wrap me in a cosy blanket of familiarity and off I go.
That’s what happened with The Great Hack. I knew this story and its patterns all too well. And while that familiarity was a comfort as I was snoring on my sofa, the reality I woke up to is that we are becoming jaded by these stories of Big Tech and its potential for manipulation.
The feedback I’ve heard on The Great Hack has been broadly positive. Friends and family who, unlike me, didn’t follow this story closely as it was breaking had a lot to learn from the Netflix documentary.
It tells the tale of the Cambridge Analytica scandal centred on a particular cast of characters: Prof David Carroll, who brought a case against Cambridge Analytica to retrieve the data they had on him; journalist Carole Cadwalladr, who famously broke whistleblower Christopher Wylie’s story; and Brittany Kaiser, the company’s former business development director who sought out and drafted the contract for Donald Trump’s 2016 US presidential campaign. Kaiser is now cooperating with investigations into the data-mining company.
At the time, the mainstream coverage of the Cambridge Analytica scandal did wonders for data literacy. As with The Great Hack, the story reached many people who weren’t conscious of how personal data is collected, aggregated and put to use online. Those who would normally clomp all over the internet with nary a care for what they shared were suddenly paying attention to their digital footprint.
The documentary opens with Carroll, in his role as an associate professor at Parsons School of Design, asking his class whether any of them suspect they are being shown ads based on what their phones have overheard them discussing. Of course, plenty of hands go up.
It was a clever opening hook. That feeling of being surveilled by platforms on behalf of advertisers was and is everywhere.
But here we are, a few years on from the scandal that affected 87 million Facebook accounts, and things aren't much different. Hyp3r, a trusted marketing partner of Facebook until just a few days ago, unabashedly built a business on what Cambridge Analytica had similarly sold: intense content targeting based on shadow profiles compiled from data that users likely provided without their knowledge.
Perhaps the most unnerving part of the Hyp3r revelation – brought to light by a Business Insider investigation – was that data was being gathered from Instagram stories. A feature emulated from Snapchat to sate a growing appetite for impermanent online content, stories are meant to disappear after 24 hours. Yet here was a data miner saving the details for later. Not even your ephemeral content is safe from the scrape.
The danger of becoming jaded by terrifying tales of powerful, insidious technology scaling globally and unchecked is that we can sleepwalk into another Cambridge Analytica. And the more of these scandals there are – and there will be more – the more desensitised and resigned we will become to this online environment of marketing surveillance.
We must resist finding comfort in the familiar and stay alert to what's evolving around us.