Social media giant Meta continues to take important steps to protect its young users. Meta, which in 2021 blocked adults from messaging users under 18 who do not follow them, is now expanding that rule to better shield young people from unwanted contact. By default, users under 16 (or under 18, depending on the country) will no longer receive direct messages (DMs) from anyone they do not follow. The restriction applies even to messages sent by other young people.
This new safety measure applies to both Instagram and Messenger. On Messenger specifically, younger users will only be able to receive messages from their Facebook friends or people in their phone contacts. Because the setting is enabled by default, teens whose accounts are under parental supervision will need their guardian's approval to change it. Of course, the setting depends on the age a user declares and on the technology Meta has built to estimate people's ages, so it won't be foolproof.
“We want young people to have safe and age-appropriate experiences in our applications,” Meta said in a statement. Earlier this month, Meta announced that it would begin hiding content that could be harmful to young people on Instagram and Facebook. A user under 16 will not see posts about self-harm, violence, eating disorders and other harmful topics in their Feed and Stories, even when those posts come from accounts they follow. Meta also introduced an awareness feature that sends “nightly alerts” prompting teens under 18 to close the app and go to bed once they have been browsing for more than 10 minutes.
Meta made these changes after facing lawsuits and complaints about how it built and retained its young user base. An unsealed lawsuit filed against the company by 33 states accuses it of actively targeting children under 13 to use its apps and websites, and of continuing to collect their data even though it knew their age. Additionally, a report in The Wall Street Journal alleged that Instagram was serving suggestive images of children, along with overtly sexual adult videos, to accounts that follow young influencers. In December 2023, the state of New Mexico sued Meta, claiming that Facebook and Instagram algorithms recommended sexual content to minors. And earlier this month, The Wall Street Journal reported on graphic internal Meta presentations related to that case: employees reportedly estimated that 100,000 child users were harassed daily on Facebook and Instagram, underscoring the need for stricter measures on Meta's platforms.
Meta’s efforts to protect young users
These new safety measures aim to reduce the risks that children and young people face online. High-profile incidents and growing safety concerns on social media platforms push companies to keep developing new strategies to protect their users. The changes should let young people navigate the digital world more safely and help parents better manage their children's online interactions, and they show Meta prioritizing the well-being of its young users and taking proactive measures to protect them.