Meta gives Australian kids 2-week warning to delete accounts as world-first social media age restrictions loom.

Melbourne, Australia — In a groundbreaking move poised to redefine online engagement for minors globally, technology giant Meta on Thursday initiated a two-week countdown for thousands of young Australians, urging them to download their digital histories and delete their accounts from Facebook, Instagram, and Threads. This unprecedented action comes in response to Australia’s world-first social media ban, which will prohibit accounts for children younger than 16, taking effect on December 10, 2025.

A fortnight earlier, the Australian government confirmed which platforms its pioneering legislation would cover, mandating that major services including Meta’s trio (Facebook, Instagram and Threads), Snapchat, TikTok, X (formerly Twitter) and YouTube take "reasonable steps" to exclude Australian account holders under the age of 16. This regulatory push aims to safeguard children from the harms associated with prolonged social media exposure, ranging from cyberbullying and exposure to inappropriate content to negative impacts on mental health and development.

Meta, the California-based tech behemoth, has become the first among the targeted companies to publicly outline its compliance strategy. Starting Thursday, the company began sending SMS messages and emails to thousands of suspected underage account holders, notifying them that access to their platforms would be denied from December 4. This swift deadline provides a critical, albeit short, window for young users to preserve their digital memories and contacts before their online presence on these platforms is effectively erased.

"We will start notifying impacted teens today to give them the opportunity to save their contacts and memories," Meta affirmed in a statement, emphasizing its commitment to facilitating a smooth transition despite the restrictive nature of the new law. The company also indicated that young users could utilize this notice period to update their contact information, which would allow Meta to "get in touch and help them regain access once they turn 16." This provision offers a glimmer of hope for those who wish to return to the platforms in the future, maintaining a digital tether for eventual reconnection.

The scale of impact is substantial, particularly for Meta’s platforms. The company estimates that approximately 350,000 Australians aged 13 to 15 are active on Instagram, with an additional 150,000 in the same age bracket using Facebook. These figures represent a significant portion of Australia’s youth population within a nation of 28 million, underscoring the widespread implications of the new legislation. The sudden severance of these digital ties is expected to create a ripple effect, impacting social connections, access to information, and daily routines for hundreds of thousands of teenagers.

A key challenge highlighted by the new law is the complex issue of age verification. Meta has established a process for account holders aged 16 and older who may have been mistakenly identified as underage. These individuals can appeal the decision by contacting Yoti Age Verification, a third-party service, and providing government-issued identity documents or a "video selfie" to verify their age. This reliance on advanced verification methods, however, is not without its critics. Terry Flew, co-director of Sydney University’s Center for AI, Trust and Governance, voiced concerns about the accuracy of such technologies. "In the absence of a government-mandated ID system, we’re always looking at second-best solutions around these things," Flew told the Australian Broadcasting Corp., estimating a failure rate of at least 5% for facial-recognition technology. This raises questions about potential misidentification, privacy implications of submitting personal documents, and the digital exclusion of those unable or unwilling to comply with these stringent verification demands.

The Australian government has preemptively cautioned platforms against demanding that all account holders prove they are over 15, deeming such a measure an "unreasonable response" to the new age restrictions. Regulators argue that social media companies already possess sufficient data on many account holders to ascertain their age without imposing a universal verification burden. This stance underscores the government’s expectation that platforms leverage their existing data infrastructure to comply, rather than shifting the entire responsibility and privacy burden onto users. Non-compliance carries significant financial penalties, with social media companies facing fines of up to 50 million Australian dollars (approximately $33 million USD) if they are found to be failing to prevent individuals under 16 from creating or maintaining accounts on their platforms.

Antigone Davis, Meta’s vice president and global head of safety, expressed a preference for a more integrated age verification system. She advocated for app stores, such as Apple App Store and Google Play, to collect age information during the initial sign-up process and verify users are at least 16 years old before granting access to apps like Facebook and Instagram. "We believe a better approach is required: a standard, more accurate, and privacy-preserving system, such as OS/app store-level age verification," Davis stated. She added that this approach, combined with Meta’s ongoing investments in age assurance, would offer "a more comprehensive protection for young people online," suggesting a systemic solution would be more effective and less intrusive than piecemeal platform-specific efforts.

The legislation has garnered strong support from parent advocacy groups. Dany Elachi, founder of the Heads Up Alliance, a group that actively lobbied for the social media age restriction, urged parents to work with their children on planning how to use the hours previously consumed by social media. "The principle that children under the age of 16 are better off in the real world, that’s something we advocated for and are in favor of," Elachi remarked. While acknowledging his group did not support every aspect of the legislation, he emphasized its core benefit: "When everybody misses out, nobody misses out. That’s the theory. Certainly we expect that it would play out that way. We hope parents are going to be very positive about this and try to help their children see all the potential possibilities that are now open to them." Elachi also criticized the government for announcing the complete list of age-restricted platforms only on November 5, leaving families and platforms little time to prepare.

Despite the enthusiasm from some quarters, the legislation faced significant resistance during its passage last year, including from several children’s advocacy groups and human rights organizations. Mat Tinkler, CEO of the Save the Children charity, issued a statement a year ago, when the ban was initially approved by Australian lawmakers, welcoming the government’s intent to protect children but questioning the chosen method. Tinkler argued that the solution should focus on regulating social media companies themselves, rather than imposing a blanket ban on children. He urged the government to "instead use the momentum of this moment to hold the social media giants to account, to demand that they embed safety into their platforms rather than adding it as an afterthought, and to work closely with experts and children and young people themselves to make online spaces safer, as opposed to off-limits."

Similarly, the Australian Human Rights Commission, an independent government body, expressed "serious reservations" prior to the law’s approval. The commission contended that there were "less restrictive alternatives available that could achieve the aim of protecting children and young people from online harms, but without having such a significant negative impact on other human rights." As an example, it suggested placing a legal "duty of care" on social media companies, which would compel platforms to actively design and operate their services in a manner that prioritizes the safety and well-being of young users, rather than simply excluding them.

As the December 10 deadline approaches, the implications of Australia’s pioneering law are being watched closely by governments and tech companies worldwide. This legislation sets a precedent, marking a significant escalation in the global effort to regulate social media’s impact on youth. While proponents foresee a healthier, more engaged generation of young Australians, critics worry about the practicalities of enforcement, potential privacy infringements, and the risk of driving children towards less regulated, ‘darker’ corners of the internet in their quest for digital connection. The coming weeks will not only test Meta’s compliance mechanisms but also illuminate the broader societal and technological challenges inherent in balancing online safety with fundamental freedoms and the evolving landscape of digital childhood.
