The UK said thousands of predators could go undetected if Meta brings encryption to its platforms, while the country moves to make its Online Safety Bill law.
The UK government is urging Meta not to roll out end-to-end encryption on its platforms, unless it has “robust safety measures” to protect children from sexual abuse.
The government claims encryption could significantly reduce the effectiveness of UK authorities in detecting predators online. The country’s National Crime Agency claimed 92pc of Facebook Messenger and 85pc of Instagram Direct referrals could be lost if encryption is rolled out, which means “thousands of criminals a year could go undetected”.
Meta already has end-to-end encryption in place for its WhatsApp messaging service, which is designed to prevent third parties from seeing content in conversations. Meta shared plans last month to bring end-to-end encryption to Messenger by the end of this year.
The UK claims its push to have Meta install safety measures – or halt its encryption plans – is being supported by organisations such as the NSPCC, the Marie Collins Foundation and the Internet Watch Foundation.
UK home secretary Suella Braverman, MP, said Meta has failed to give assurances that it will keep its platforms safe “from sickening abusers”.
“[Meta] must develop appropriate safeguards to sit alongside their plans for end-to-end encryption,” Braverman said. “The use of strong encryption for online users remains a vital part of our digital world and I support it, so does the government, but it cannot come at a cost to our children’s safety.
“I have been clear time and time again, I am not willing to compromise on child safety.”
A Meta spokesperson told The Guardian that encryption is used to keep the UK population safe from “hackers, fraudsters and criminals”. The spokesperson also said the company has spent the past five years developing safety measures to combat abuse “while maintaining online security”.
The Online Safety Bill
The push against Meta’s encryption plans comes shortly after the UK passed its long-debated Online Safety Bill, in an attempt to introduce “the most powerful child protection laws in a generation”.
This Bill aims to crack down on the possession and sharing of child sexual abuse material (CSAM) on popular privacy-focused messaging apps. Once it becomes law, it will give regulators the power to direct private companies to deploy technology that bypasses encryption and scans for CSAM on phones.
While the UK government argues that the added powers will prevent further distribution of CSAM and other illegal content, messaging services and some rights groups have strongly opposed the measure over potential privacy violations.
Leaked documents earlier this year indicated that the EU is considering similar rules to prevent the spread of CSAM, which would force tech companies to scan user content.