EU proposes new rules to tackle online child sexual abuse content

11 May 2022

EU commissioner for home affairs Ylva Johansson in 2019. Image: European Union/European Parliament via Flickr (CC BY 4.0)

The proposed rules would make tech companies assess and reduce the risk of child sexual abuse material appearing on their platforms.

The European Commission has proposed a set of new rules to tackle child sexual abuse content online, a move that has raised concerns among privacy and digital rights activists.

The proposed legislation, shared today (11 May), would require online platforms to detect, report and remove child sexual abuse material on their services. Platforms would also have to assess and mitigate the risk of their services being misused for this purpose.

A new EU Centre on Child Sexual Abuse would be established to help tech companies comply with the new rules and to receive reports of online child abuse. The centre would also assist law enforcement and Europol by reviewing reports from providers before passing the information on.

“Today’s proposal sets clear obligations for companies to detect and report the abuse of children, with strong safeguards guaranteeing privacy of all, including children,” EU commissioner for home affairs Ylva Johansson said.

“Child sexual abuse is a real and growing danger: not only is the number of reports growing, but these reports today concern younger children. These reports are instrumental to starting investigations and rescuing children from ongoing abuse in real time.”

The European Commission said the current system based on voluntary detection and reporting by companies has been “insufficient to adequately protect children”. It added that 95pc of all child sexual abuse reports in 2020 came from one company, despite evidence that the issue exists on multiple platforms.

The issue has been exacerbated during the Covid-19 pandemic, with the Internet Watch Foundation noting a 64pc increase in the number of actioned reports of child sexual abuse in 2021 compared to 2020.

However, the proposals have raised privacy concerns, as digital platforms may have to scan user content and private messages to detect child sexual abuse material, or create an encryption backdoor to do so.

European Digital Rights policy adviser Ella Jakubowska told Politico that the idea of people in the EU having their private messages “indiscriminately” scanned at all times is “unprecedented”.

There are concerns the proposals could lead to backdoors to circumvent end-to-end encryption, a method of secure communication that prevents third parties from accessing messages.

The European Commission explained in a Q&A on the new rules that a large portion of child sexual abuse reports come from services that are already encrypted or may become encrypted in future. It added that if these services were exempt, the consequences “would be severe for children”.

Software engineer and consultant Alec Muffett said on Twitter that the EU has declared “war on end-to-end encryption”.

The new rules will now be discussed within the European Parliament and Council. If passed, they will replace the current interim regulations.

The proposed regulation comes a month after the approval of the landmark Digital Services Act, which aims to make the internet safer with new rules for all digital services.



Leigh Mc Gowran is a journalist with Silicon Republic
