Ofcom moves to protect children from ‘toxic algorithms’

9 May 2024


New draft rules have been revealed as part of the UK’s Online Safety Act to protect children from harmful content, but some parents believe they don’t go far enough.

Ofcom, the UK’s media and communications regulator, is pushing tech companies to make their services safer for children under new draft rules.

The regulator has published more than 40 practical measures in its draft Children’s Safety Codes of Practice, which detail how it expects online services to protect children. These measures include more robust forms of age verification and minimising children’s exposure to harmful content online.

The measures relate to the UK’s Online Safety Act, a set of rules focused on protecting children from online harm by placing more responsibility on tech companies to prevent and remove illegal and harmful content.

Ofcom’s draft rules would require online services that are at risk of containing harmful content to implement “highly effective age-checks” to prevent children from seeing it. Harmful content includes posts relating to suicide, self-harm, eating disorders and pornography.

Tech companies will also be required to ensure their algorithms filter out harmful content and do not operate in ways that harm children. Young users must also be given a way to send negative feedback to these companies, so that services can screen out content those users don’t want to see.

“We want children to enjoy life online,” said Ofcom CEO Melanie Dawes. “But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”

“When we passed the Online Safety Act last year we went further than almost any other country in our bid to make the UK the safest place to be a child online. That task is a complex journey but one we are committed to, and our groundbreaking laws will hold tech companies to account in a way they have never before experienced,” said the UK’s technology secretary Michelle Donelan, MP.

The regulator claims these draft rules go much further than current industry practices and demand changes from tech companies operating in the UK. But the measures have faced some criticism.

The BBC reports that parents of children who died after seeing harmful online content have described the proposed rules as “insufficient”, while one parent said change is happening “at a snail’s pace”.

Ofcom said it will launch an additional consultation later this year on how automated tools such as AI can be used to “proactively detect” illegal and harmful content.


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com