Tech giant Meta has publicly called on the Australian government to reconsider its pioneering law banning children under 16 from social media platforms. The appeal comes as the company revealed it had removed more than 544,000 accounts in a single week to comply with the controversial legislation.
Mass Account Removal Under New Law
Australia's groundbreaking law, which came into effect on December 10 last year, requires major platforms such as Meta, TikTok, and YouTube to prevent underage users from holding accounts. The stakes for non-compliance are high: companies face fines of up to Aus$49.5 million (US$33 million) if they fail to take reasonable steps to enforce the ban.
In the week leading up to December 11, Meta reported sweeping enforcement action: the company, led by billionaire Mark Zuckerberg, removed 331,000 underage accounts from Instagram, 173,000 from Facebook, and 40,000 from Threads. While affirming its commitment to following the law, Meta expressed significant concerns about its broader impact.
Meta's Call for a Different Approach
In an official statement, Meta argued for a more collaborative and nuanced strategy. The company urged the Australian government to work with the industry to find a better path forward. Instead of what it termed "blanket bans," Meta suggested incentivizing platforms to raise standards for providing safe, privacy-preserving, and age-appropriate online experiences.
Meta also reiterated an earlier proposal that would place key responsibility on app stores. The company advocates a system in which app stores must verify a user's age and obtain parental consent before anyone under 16 can download an app. Without such a foundational check, Meta warned, the industry is stuck in a futile game of "whack-a-mole" in which teens simply migrate to new, less-regulated apps to circumvent the ban.
Government Stands Firm, Highlights Data Responsibility
The Australian government has firmly defended the law. A government spokesperson stated that the legislation is about holding social media companies accountable for the harm they cause young Australians. The spokesperson pointed out that platforms like Meta collect vast amounts of user data for commercial purposes and argued they can and must use that information to ensure under-16s are not on their platforms.
Meta countered by highlighting concerns from parents and experts. The company warned that the ban could isolate young people from supportive online communities and push them toward less-regulated apps and darker corners of the internet. Meta stated that initial impacts suggest the law is not meeting its objective of improving the safety and well-being of young Australians.
Addressing the complex challenge of age verification online, Meta described its compliance as a "multilayered process." Since the ban took effect, the California-based firm has helped establish the OpenAge Initiative, a non-profit group that has launched age-verification tools named AgeKeys for use with participating platforms, signaling the search for a more universal technical solution.