Instagram reveals private ‘teen accounts’ in online safety push

17 Sep 2024

Image: © ChayTee/Stock.adobe.com

The platform is being praised for these new privacy-focused accounts, but these changes come after years of whistleblowers, lawsuits and hearings criticising Meta’s efforts to protect children and teens.

Meta has unveiled new features to keep young people safe on Instagram, as countries around the world look to protect children on social media.

The tech company says these Instagram teen accounts are designed to address some of the key issues parents have with its platform. Teenagers will automatically be placed into these accounts, which are designed to limit who can contact them and the content they see.

Meta says these teen accounts are private by default – users need to approve new followers, and accounts that don’t follow them can’t see their content or interact with them. Teens on these accounts can also only be messaged by people they follow or are already connected to.

These accounts also have settings to limit addiction – another key concern raised by parents and government officials. Teenagers on these accounts will get notifications telling them to leave the app after 60 minutes each day. These accounts also have a ‘sleep mode’ function, which mutes notifications and direct messages between 10pm and 7am.

Yvonne Johnson, the president of the US National Parent Teacher Association, said parents are “grappling with the benefits and challenges of the internet and digital media for their teens”.

“With teens automatically placed in Teen Accounts and certain privacy settings turned on by default, this update demonstrates that Meta is taking steps to empower parents and deliver safer, more age-appropriate experiences on the platform,” Johnson said.

Whistleblowers, lawsuits and hearings

These Instagram teen accounts are more reactive than proactive, however – Meta has been criticised for years over how it handles child safety on its platforms. The notorious Facebook Files, released in 2021 by whistleblower Frances Haugen, suggested the company knew Instagram was damaging to the mental health and wellbeing of teenage girls.

These claims were backed up by another Meta whistleblower, Arturo Béjar, who worked for Meta as a consultant supporting Instagram’s wellbeing team in 2019. Béjar claimed that the company was fully aware of the harm its platforms cause young users and that its policies do little to prevent that harm.

Since then, Meta has faced a barrage of legal action in both the US and the EU. The EU is currently investigating whether Meta’s algorithms stimulate addictive behaviours in children.

In the US, Meta is in the middle of a massive federal lawsuit brought by dozens of US attorneys general, which accuses the tech giant of harmful practices towards children and teenagers. The lawsuit also targets other social media companies, including those behind YouTube, Snapchat and TikTok.

These four companies are also being sued by New York City, which aims to hold them accountable for their alleged “damaging influence” on the mental health of children.

At the end of January this year, five heads of social media companies were grilled at a US hearing focused on Big Tech companies and the “online child sexual exploitation crisis”. During this hearing, US senators called for new laws to hold social media companies accountable, levelled scathing criticism at Big Tech CEOs and pushed Meta CEO Mark Zuckerberg to apologise to affected families.

In a recent podcast, Zuckerberg said he was done apologising and that he had made a “20-year mistake” of taking responsibility for issues where he believes Meta was not to blame.

Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com