Instagram is tightening the settings on its "teen accounts" to add new limits on what kids on the platform are able to see. Older teens will also no longer be able to opt out of the default stricter settings without parental approval.
Meta first introduced teen accounts for Instagram a year ago, when it began automatically moving teens into the more locked-down accounts that come with stricter privacy settings and parental controls. The company recently rolled out the accounts for teens on Facebook and Messenger too, and has used AI tools to detect teens who lie about their age.
While teen accounts are meant to address long-running criticism about Meta's handling of teen safety on its apps, the measures have been widely criticized as not going far enough to protect the company's most vulnerable users. A recent report from safety advocates at Heat Initiative found that "young teen users today continue to be recommended or exposed to unsafe content and unwanted messages at alarmingly high rates while using Instagram Teen Accounts." (Meta called the report "deeply subjective.")
Now, Meta is locking down teen accounts even more. With the latest changes, teens will no longer be able to follow or see content from accounts that "regularly share age-inappropriate content" or that seem "age-inappropriate" based on their bio or username. Meta says it will also block these accounts from appearing in teens' recommendations or in search results in the app.
Instagram will also block a "wider range of mature search terms" for teens, including words like "alcohol" and "gore," as well as intentional misspellings of these words, a common tactic for evading Instagram's filters. And if an account a teen already follows shares a post that goes against these rules, the teen should be prevented from seeing it, including when it's sent to their DMs.
While these changes may seem like Meta once again filling somewhat obvious gaps in its safety features, the company says the revamp is meant to make the content teens encounter on Instagram more like a PG-13 movie. "Just like you might see some suggestive content or hear some strong language in a PG-13 movie, teens may occasionally see something like that on Instagram – but we’re going to keep doing all we can to keep those instances as rare as possible," the company explained in a blog post.
That's a somewhat confusing analogy, as there's a fairly wide spectrum of what might appear in a PG-13 movie. Meta also says that some of its rules for teens are stricter than the PG-13 standard. For example, the app aims to prevent teens from seeing any kind of "sexually suggestive" content or images of "near nudity," even though that type of content might appear in movies rated for 13-year-olds.
For parents who want even tighter restrictions, Instagram is also adding a new "limited content" setting that filters "even more" content from teens' view (Meta didn't explain what exactly would be restricted). The setting also prevents teens from accessing any comments on the platform, whether on their own posts or on other users'. Finally, Meta is testing a new reporting feature for parents who use Instagram's parental control settings to monitor their teens' use of the app. With the feature, parents can flag specific posts they feel are inappropriate to trigger a review by Meta.
Meta says the latest changes will be rolling out "gradually" to teen accounts in the US, UK, Canada and Australia to start and that it will eventually "add additional age-appropriate content protections for teens on Facebook."