UK could force tech firms to scan for child abuse images – or face fines

6 Jul 2022


Tech firms that fail to meet their responsibilities could face fines of up to £18m or 10pc of their global annual turnover.

The UK government is proposing changes to its online safety bill that could require tech companies to scan for and remove child sexual abuse material (CSAM) or face significant fines.

The amendment would give the UK’s communications watchdog, Ofcom, additional powers to help in its oversight and enforcement role. For example, Ofcom could require a messaging platform to use “highly accurate technology” to scan public and private channels for CSAM.

This particularly affects private messaging platforms such as Meta’s WhatsApp and Messenger services. The UK Department for Digital, Culture, Media and Sport said the use of this power would be subject to “strict safeguards” to protect the privacy of users.

Under the amendment, Ofcom would have to be satisfied that no other measures would be similarly effective and that there is evidence of a widespread problem on a service.

However, “all platforms in scope”, such as social media platforms, forums, messaging apps and search engines, will have to take measures to tackle illegal content on their services.

Ofcom will have the power to ensure tech companies take “particularly robust action” to tackle issues such as child sexual abuse and exploitation online. Tech companies will also need to have clear and accessible ways for users to report harmful content or challenge a wrongful takedown.

Firms that fail to meet their responsibilities could face fines of up to £18m or 10pc of their global annual turnover, whichever is higher.

“The regulator will have the powers necessary to take appropriate action against all companies in scope, no matter where they are based,” the UK department said in a statement. “This is essential given the global nature of the internet.”

Experts have called on the UK government to flesh out the details of the scanning technology Ofcom can impose on companies, amid concerns that it could be misused for surveillance, the BBC reported.

Apple proposed a set of tools for detecting CSAM last August, but postponed the roll-out following criticism.

Following changes to its proposals, Apple released a safety feature designed to warn children when they receive or send photos that contain nudity. This feature was launched in the US last December.

Targeting disinformation

The proposed amendment also targets disinformation, requiring social media companies to proactively look for and remove disinformation from foreign state actors that harms the UK.

The digital department said that people are concerned about the threat of malicious state-linked disinformation, particularly following Russia’s invasion of Ukraine.

Under the amended bill, tech companies will have to take “proactive, preventative action” to minimise people’s exposure to state-linked disinformation aimed at interfering with the UK.

“Disinformation is often seeded by multiple fake personas, with the aim of getting real users, unwittingly, then to ‘share’ it,” said UK security minister Damian Hinds.

“We need the big online platforms to do more to identify and disrupt this sort of coordinated inauthentic behaviour. That is what this proposed change in the law is about.”


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com