Instagram is connecting large paedophile networks, report claims

8 Jun 2023


Meta said it has set up an internal task force to investigate and address these claims, while the report also claimed Twitter’s CSAM scanning ‘broke’ briefly.

A new report has named Instagram as the “primary platform” for the spread of child sexual abuse material (CSAM) online.

The Stanford Internet Observatory (SIO) claims its investigation found large networks of accounts “purportedly operated by minors”, which sell self-generated illicit sexual content. The investigation examined various platforms but found Instagram to be the most “important” for these networks.

The Meta-owned site is allegedly favoured by these networks due to its recommendation algorithms and direct messaging feature, which the report claims connects buyers with sellers. The report also said Instagram’s popularity and “user-friendly interface” make it a preferred option.

The report claimed Instagram’s recommendation algorithms “effectively advertise” CSAM, as these algorithms “analyse user behaviors and content consumption to suggest related content and accounts to follow”.

The SIO said a tip from The Wall Street Journal led to the investigation, with the newspaper publishing its own article on the findings yesterday (7 June).

“Sellers are often familiar with audience growth and ban-evasion techniques,” SIO said in a Twitter post. “Instagram is by far the most popular platform, but this is a widespread issue.”

A Meta spokesperson told SiliconRepublic.com that the company has set up an internal task force to investigate these claims and “immediately address them”.

“Child exploitation is a horrific crime,” the spokesperson said. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”

“Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps.”

Twitter and Telegram

The investigation focused on Instagram but noted that other platforms are also being used to sell CSAM.

The SIO report claimed that Twitter had an “apparent and now resolved regression” which briefly allowed CSAM content to be posted on public profiles. Alex Stamos, one of the authors of the SIO report, said on Twitter that the site’s basic scanning for CSAM “broke” after Elon Musk took over the company last year “and was not fixed until we notified them”.

Stamos also said that Twitter continues to have “serious issues with child exploitation”. Meanwhile, the SIO report claimed that Telegram “implicitly” allows the trading of CSAM through private channels.

The report called for an “industry-wide” initiative to limit the production, discovery, advertisement and distribution of this content and noted that some platforms could not be analysed “in-depth using open-source methods”.

“These networks utilise not only social media platforms, but file sharing services, merchants and payment providers,” the report said. “Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures and methods for identifying buyers.”

Last month, leaked documents suggested that many EU countries are in favour of scanning encrypted messages to prevent the spread of CSAM, with Spain supporting a ban on end-to-end encryption.


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com