Instagram’s teen account settings will now be guided by the PG-13 guidelines used for movies, according to Meta — but whether they’ll be effective is another question, tech experts say.
Teen account users have so far been barred from seeing “sensitive content,” including posts that are sexually suggestive or depict physical violence, for example. The new rules go a step further: Instagram will avoid recommending posts that contain strong language or risky stunts, as well as anything else that “could encourage potentially harmful behaviours,” Meta announced Tuesday.
The new rules will also bar teenage users from following accounts that consistently post age-inappropriate content, and from seeing search results for certain terms, such as “gore” or “alcohol.” Teens will also be blocked from opening links via Instagram that go against the platform’s updated guidelines.
“Just like you might see some suggestive content or hear some strong language in a PG-13 movie, teens may occasionally see something like that on Instagram — but we’re going to keep doing all we can to keep those instances as rare as possible,” the company said in a blog post. “We recognize no system is perfect, and we’re committed to improving over time.”
Meta says it made the changes in an effort to align its own rules with an “independent standard” that parents are familiar with. The changes will begin rolling out to users in Canada, the U.S., U.K. and Australia starting Tuesday, and Meta says the rollout will be complete by the end of 2025.
Teen accounts, first announced in September 2024, apply to all users under 18 years of age. They are set to private by default (meaning other users have to request to follow them and see their content), can’t receive DMs from strangers and have notifications silenced at night.
Meta has periodically tightened the restrictions on teen accounts since then, such as limiting teens’ ability to livestream or to unblur nudity, though the company says this is the most significant rule change since teen accounts were first introduced.
Teen accounts can’t fix everything
The added controls are “a little bit, very late,” according to Richard Lachman, a professor in the RTA School of Media at Toronto Metropolitan University.
“We are seeing some movement from the major platforms to introduce things that frankly they could have been doing a decade ago,” Lachman said. Regulating content by filtering out certain keywords, for example, isn’t very advanced, according to Lachman.
And while the new rules will apply to any users who input a birthday that identifies them as being younger than 18, Lachman says it’s not hard for kids to find a way around the controls altogether.
While users could simply enter an age above 18 when creating an account to circumvent the rules, Meta says it will use age-prediction technology to try to catch teens using adult accounts.
Meta does this partly with AI, analyzing signals such as when an account was created and how it interacts with content on the platform, then placing content restrictions on accounts it suspects belong to teens in disguise. Meta also says it requires teens to verify their identity with a “video selfie or ID check” if they try to change their birth date from an age under 18 to one over 18.
But technology analyst Carmi Levy says how Meta is predicting a user’s age, and how age-inappropriate content is screened out, is still quite opaque.
“We haven’t seen the fulsome data from Meta that would give us a better picture of exactly how these tools are working or if they’re working at all,” Levy said. “So we’re gonna have to [take] them at their word.”
Research has shown that the teen account settings in place for the past year haven’t always been successful at filtering out unsafe posts, including eating disorder content. One recent study by the Heat Initiative, which advocates for children’s safety online, found that nearly half of teenage Instagram users had encountered unsafe content (including things like hate speech or alcohol/drug use content) or unwanted messages in the previous month.
When asked about this kind of research, a spokesperson for Meta told CBC News that the company acknowledges no system is perfect. They added that parents will also be able to give Meta feedback about what kinds of content should be off-limits to teens, to help improve its moderation.
Parents still need to be vigilant
Levy says the new restrictions in and of themselves will help a bit, but he worries that announcements like this one might lull parents into a false sense of security.
“Parents assume that platforms like Meta have this problem in hand, that they’re taking care of it, and then they kind of ease back on their efforts to ensure that their kids are safe in the digital environment,” Levy said.
“It’s still ultimately the family’s responsibility — mom, dad, caregivers, kids and the community that surrounds them — to ensure that their kids are raised as good digital citizens.”
As a mom of two teenagers — a 14-year-old and a 17-year-old — Katherine Korakakis says the change is welcome. “Anything we can do to equip parents with the tools they need to have … a safer experience for their children online, I think is [a] good step forward,” Korakakis said.

But while her kids use Instagram teen accounts, Korakakis doesn’t rely on the platform’s guardrails alone. She says she also personally monitors their social media use, keeps parental controls that limit which sites her kids can access on their devices, and sets time limits for individual platforms.
Korakakis adds that she thinks the only real solution is to empower parents with the information they need to manage their children’s social media use.
Levy also says it’s important for parents to talk with their kids about the risks of social media, because restricting kids from technology will only force them to find a way around the rules.
“Have those conversations so they know they can come to you in a trusting environment, no judgment,” Levy said. “This is a partnership-based approach, not an adversarial one.”

