Meta is tightening the rules for its teenage users on Instagram, announcing on Tuesday that it will now limit the content they see to what they would typically find in a PG-13 movie. The move comes after years of intense criticism over the company’s handling of child safety and the mental health of its young users.
Under the new guidelines, Meta will start hiding certain accounts from teenagers, including those that share sexualized content or posts about drugs and alcohol. Instagram will also stop recommending posts containing swear words to teens, though they will still be able to search for them.
In a media briefing, Meta executives said their previous rules were already in line with PG-13 standards, but that many parents were confused about what their kids were actually seeing on the platform. To make things clearer, the company officially decided to align its policies with the familiar movie rating system. “We made these changes so teens’ experience in the 13+ setting feels closer to the Instagram equivalent of watching a PG-13 movie,” the company said in a blog post.
The social media giant has been under fire for years, especially after a 2021 report from The Wall Street Journal revealed how harmful Instagram can be for teenage girls. Other reports have shown how easily teens can use the platform to find drugs. In response, Meta has rolled out some new safety tools and features for parents.
The new content guidelines begin rolling out today in the U.S., UK, Australia, and Canada, with other countries to follow. The change is a clear attempt by Meta to show it is taking child safety seriously, but it remains to be seen whether the new rules will be enough to satisfy its many critics.