Australia to Ban Under-16s From Meta Platforms as Company Begins Shutting Down Teen Accounts
Meta has begun warning young users in Australia that their Facebook, Instagram and Messenger accounts will soon be disabled as the country prepares to enforce one of the world’s strictest age-based social media rules. The move comes ahead of a new Australian mandate that will bar anyone under 16 from using Meta platforms, a policy set to take effect on December 10.
Teenagers across the country started receiving in-app notifications this week advising them that their access will soon be revoked. Once the ban begins, accounts belonging to users identified as under 16 will be locked, cutting off their ability to log in, post or message. Meta confirmed that the shutdowns will be automatic and will apply even if a parent has previously approved the account.
Australia’s government says the policy is part of a broader effort to reduce online harms involving minors, including exposure to predators, targeted advertising, bullying and mental health risks linked to social media use. The ban mirrors a growing global shift, with countries seeking tighter control over how tech companies handle young users, although Australia’s age cutoff is among the most aggressive to date.
Meta said it is complying with the law but has expressed concerns about how the rule will work in practice. The company has long argued that mandatory government age verification could push teens toward unsafe or unregulated online spaces. The new requirement forces Meta to assess user ages more strictly and enforce removals even if teens attempt to bypass sign-up checks.
The upcoming shutdown has sparked debate among Australian lawmakers, parents, educators and digital rights experts. Supporters say the crackdown is overdue and will shield children from platforms that have been linked to anxiety, self-harm content and addictive behavior. Critics argue that blanket bans do little to address the underlying risks and instead remove teens from mainstream platforms that offer parental controls, reporting systems and safety teams.
Meta’s notification system will continue over the next two weeks as the company works to identify accounts belonging to under-16 users. Once locked out, teens will be unable to appeal unless they can provide government-verified proof of age confirming they are 16 or older. Meta says users who created accounts before turning 16 but have since passed the age threshold will have their access restored after verifying their age.
The shutdown raises questions about enforcement and accuracy. Age detection on social platforms relies on a mix of signals, including reported date of birth, behavior patterns, and in some cases, AI-based prediction models. While Meta says it is improving these systems, it acknowledges there will be mistakes. Civil society groups warn that some teens could be incorrectly flagged, while others may slip through the system entirely.
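To illustrate the kind of signal-mixing described above, here is a minimal sketch of how a platform might combine a stated date of birth, a behavioural score and a model prediction into a single under-16 flag. The signal names, weights and threshold are assumptions for illustration only and do not reflect Meta’s actual systems.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative only: signals, weights and threshold are assumed, not Meta's.
UNDER_16_THRESHOLD = 0.5  # assumed cut-off for flagging an account for review


@dataclass
class AccountSignals:
    stated_birthdate: date            # date of birth entered at sign-up
    behavioural_minor_score: float    # 0..1, e.g. from interaction patterns (assumed)
    model_minor_probability: float    # 0..1, output of an age-prediction model (assumed)


def stated_age(signals: AccountSignals, today: date) -> int:
    """Age implied by the self-reported date of birth."""
    b = signals.stated_birthdate
    return today.year - b.year - ((today.month, today.day) < (b.month, b.day))


def likely_under_16(signals: AccountSignals, today: date) -> bool:
    """Combine the signals into a single flag.

    A plain weighted average stands in for whatever proprietary
    logic a platform actually uses.
    """
    stated_minor = 1.0 if stated_age(signals, today) < 16 else 0.0
    combined = (
        0.5 * stated_minor
        + 0.25 * signals.behavioural_minor_score
        + 0.25 * signals.model_minor_probability
    )
    return combined >= UNDER_16_THRESHOLD


if __name__ == "__main__":
    example = AccountSignals(
        stated_birthdate=date(2011, 3, 14),
        behavioural_minor_score=0.8,
        model_minor_probability=0.7,
    )
    print(likely_under_16(example, date(2025, 12, 10)))  # True for this example
```

Because any such scoring can misfire in both directions, the accuracy concerns raised by civil society groups apply regardless of how the signals are weighted.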
The ban also highlights a broader global trend of governments taking a harder line on Big Tech. Countries including the United States, the United Kingdom and members of the European Union have been exploring stricter rules for minors, from parental consent requirements to age verification mandates. Australia’s approach stands out because it simply blocks young users outright rather than regulating what they can access within the platform.
Schools across Australia are already preparing for the shift, with some educators saying the ban could reduce classroom distractions and social conflict. However, others worry that removing teens from major communication platforms could create new challenges, including isolation or increased use of encrypted apps that have fewer safety controls.
For Meta, the decision underscores the pressure it faces as governments accelerate regulatory crackdowns. The company is also dealing with major policy shifts in Europe under the Digital Services Act, continued scrutiny from U.S. policymakers and growing litigation linked to youth mental health. The Australian ban adds to a long list of operational adjustments Meta must make across different regions.
The company has said it will continue investing in age detection, content moderation and teen safety tools globally. These include nudging teens toward private accounts, restricting messaging from adults they do not follow and filtering potentially harmful content. But in Australia, those tools will effectively become irrelevant for users under 16 once the ban fully takes effect.
Parents across the country are split on the changes. Some welcomed the clear cutoff and say it removes the burden of negotiating screen time and safety settings. Others argue that the policy is too heavy-handed and fails to give families flexibility to supervise their children online. Several digital rights organisations have also warned that the law could set a precedent for more intrusive identity checks on the internet, raising privacy concerns.
As December 10 approaches, the key unknown is how the ban will function at scale and whether it will meaningfully reduce online risks. Experts say the success of the law will depend not only on enforcement but also on whether teens migrate en masse to other platforms that fall outside Australia’s regulatory reach.
For now, Meta’s message to young users is clear. Accounts belonging to Australians under 16 will soon disappear from Facebook, Instagram and Messenger. The company says it will continue coordinating with Australian regulators but maintains that the long-term impact of the ban will only become clear once it is fully implemented.
