I Work In A High School – Here's Why Instagram Teen Accounts Aren't As Good As They Seem

These measures aren’t quite what they appear to be.

This week, Meta announced that it is introducing a new feature: Instagram Teen Accounts. According to the social media giant, the feature is aimed at those under 16 and promises “built-in protection for teens, peace of mind for parents”.

In a statement, Meta said: “We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences.

“We understand parents’ concerns, and that’s why we’re reimagining our apps for teens with new Teen Accounts.”

Teen accounts will be private by default for anyone under 16, carry the strictest messaging settings, and include time limits that prompt teens to leave the app after 60 minutes.

However, HuffPost UK spoke with one expert about why this just isn’t enough protection.

We need more robust measures to protect teens

Deborah Gallacher, Deputy Rector at Kelvinside Academy, who is responsible for pastoral care at the 600-pupil Glasgow school, told us: “It is positive to hear that Instagram has acknowledged there needs to be more protection online for young people, and the introduction of a teenage account could be a step in the right direction.”

However, Gallacher believes that these measures aren’t quite what they appear to be, and that Meta itself isn’t taking on the responsibility.

She added: “The proposed measures continue to put the onus on young people – and now, by extension, their parents – for their experiences and interactions on Instagram.

“Education and greater controls have important roles to play in safeguarding young people and empowering them to make better decisions online, and we need continued and more robust measures in this area.”

Gallacher also believes that this isn’t enough to tackle the underlying issue of harmful content on social media: “Ultimately, Meta must take more corporate responsibility for the harmful content and algorithm activity young people are potentially exposed to on its platforms.

“Failing to do this, and instead simply allowing parents greater access to their child’s account, seems tokenistic and ultimately ineffective when set against the scale of the potential harm social media poses.”

HuffPost UK also spoke with Rani Govender, Online Child Safety Policy Manager at the NSPCC, who said: “This move from Instagram is a step in the right direction and appears to be a response to Ofcom’s codes for protecting children, an indication that the Online Safety Act is beginning to have an impact.”

However, Govender agrees with Gallacher’s sentiments and added: “Safer settings ultimately still put the emphasis on children and parents needing to keep themselves safe.

“This must be backed up by proactive measures that prevent harmful content and sexual abuse from proliferating Instagram in the first place, so all children have the benefit of comprehensive protections on the products they use.”

Learn more about the new Teen Accounts here.
