Starting this week, Instagram is beginning to move new and existing users under the age of 18 into its more restrictive “Teen Accounts” system, a change that will reshape how millions of young people around the world interact with the platform. While users 16 and older will be able to relax certain settings, a core set of protections will be in place automatically for all young users.
With the new system, all teen accounts on Instagram will be private by default. This applies not just to users under 16 but to all teens under 18. The existing restrictions that prevent strangers from sending direct messages will remain in place. In addition, new features will be introduced, such as a Sleep Mode that silences notifications between 10 p.m. and 7 a.m.
“This standardizes, simplifies, and makes a huge part of what we do applicable to all young people, providing a comprehensive set of protections for young users,” Antigone Davis, Meta’s global head of safety, told The Verge in an interview.
Instagram launches new content recommendation system for teens
Instagram is also restricting the content younger users can see to age-appropriate material. Teens will be able to choose the topics they want to see in suggested content, such as “sports,” “animals,” or “travel.” The content shown to younger users on the Reels and Explore pages will also be limited. The app will additionally send alerts reminding teens to take breaks.
Alongside these changes, Instagram is also updating its parental controls. Parents will be able to see who their kids have messaged over the past seven days (without seeing the content of the messages) and which topics their kids have been viewing most often.
While teens 16 and older will be able to change these settings themselves, teens under 16 will need parental consent to make their accounts public. Parents will need to enable Instagram’s supervision tools to approve the change.
Teen Accounts will roll out gradually, starting with users in the US, UK, Australia, and Canada. Teens who sign up now will see the changes immediately, while existing users will receive the updates in about a week. Meta plans to bring the features to the European Union by the end of the year and expand them to its other platforms in 2025.
But despite these protections, questions remain about how effectively Meta can enforce the settings. “We know that some young people will try to misrepresent their age to avoid these protections, so we’re building new opportunities for young people to verify their age,” Antigone Davis said. Users who try to change their age to 18 or older already have to take a video selfie, upload an ID, or have other users vouch for their age, and Instagram is extending these systems further.
The platform will now be able to use AI to detect when a user is under the age of 18. For example, if a user declares that they are 18 when they sign up for an account, but another user tells them “happy 14th birthday,” Instagram can use that information to determine the user’s real age. “Age detection is often a difficult process, so we need to take a multi-layered approach because there is no foolproof method for this,” Davis said.
Lawmakers have taken a tougher stance on social media platforms since Facebook whistleblower Frances Haugen leaked the company’s internal research on teen mental health in 2021. Instagram has introduced a number of child safety features over the past few years, including parental controls in 2022, and has also agreed to help researchers study the platform’s mental health effects on teens and young adults.
Despite these developments, lawmakers remain concerned. In the US, attorneys general from nearly 40 states have backed the surgeon general’s proposal to add warning labels to social media platforms. And in July, the Senate passed a landmark bill intended to improve child safety online.