Two months after Australia’s landmark ban on social media use by children under 16 took effect, Snapchat says it has disabled hundreds of thousands of accounts but warns the law’s current design risks leaving young users less protected, not more.
In a blog post published this week, Snapchat said it had locked or disabled more than 415,000 Australian accounts by the end of January that either self-declared an age under 16 or were identified as underage through the company’s age-detection tools. The company said it continues to disable additional accounts daily as it works to comply with the Social Media Minimum Age law.
“Two months into the implementation of Australia’s Social Media Minimum Age (SMMA) law, Snapchat remains fully committed to complying with the legislation and supporting its underlying goal of improving online safety for young Australians,” the company said.
But the company said early implementation had highlighted what it described as “important insights about the potential limitations of this law as it currently stands,” pointing first to the difficulty of accurate age verification. It cited the Australian government’s own age assurance trial, published in 2025, which found age estimation technology was typically accurate only within a two-to-three-year range.
“In practice, this means some young people under 16 may be able to bypass protections, potentially leaving them with reduced safeguards, while others over 16 may incorrectly lose access,” the company said.
Snapchat also warned that the law’s limited scope could push teens toward less regulated platforms. “Young people won’t stop communicating when they lose access to regulated services,” the company said, noting that more than 75% of time spent on Snapchat in Australia is messaging with close friends and family. While it said there is no data yet to quantify migration to alternative apps, Snapchat argued the risk “deserves serious consideration.”
As a remedy, the company renewed its call for age verification at the app-store level rather than at the individual app level. “App store-level age verification would help address multiple risks and gaps,” the company said, arguing it would provide more consistent age signals, reduce wrongful lockouts of older teens and apply protections more broadly across the digital ecosystem.
Despite complying with the law, Snapchat reiterated its opposition to a blanket ban. “We still don’t believe an outright ban for those under 16 is the right approach,” the company said, adding it “fundamentally disagree[s] that Snapchat is an in-scope age-restricted social media platform.”
Nor is it alone in this view.
Earlier this year, Meta put out a similar statement, saying: “The premise of the law, which prevents under-16-year-olds from holding a social media account so they aren’t exposed to an ‘algorithmic experience,’ is false. Platforms that allow teens to still use them in a logged-out state still use algorithms to determine content the user may be interested in – albeit in a less personalised way that can be appropriately tailored to a person’s age.”
It described this as the rationale for its Teen Accounts, which it claims “protect teens and ensure parents have the tools they need to oversee their families online. Teen Accounts provide built-in protections so teens can enjoy everything they love about our apps – a place to make friends, find community, learn new skills, and express themselves – with the right safeguards in place.”
However, according to the social media giant, “The social media ban restricts teens from these benefits, and will result in inconsistent protections across the many apps they use, including those they are required to use in a logged-out state without the safeguards provided to registered users.”
In comments to the media earlier this year, the Minister for Communications, Anika Wells, said, “More than 4.7 million under-16 social media accounts being deactivated because of our world-first social media minimum age law is a huge achievement.”
According to Wells, “While it’s early, every account deactivated could mean one extra young person with more free time to build their community and identity offline.”
She accepted, however, that more work needed to be done. “[The] eSafety Commissioner is looking closely at this data to determine what it shows in terms of individual platforms’ compliance,” she said.
“We’ve said from the beginning that we weren’t expecting perfection straight away – but early figures are showing this law is making a real, meaningful difference.”
