Australia’s proposed ban on social media use by under-16s has sparked national debate over how the government intends to enforce it and which platforms will fall within the law’s scope, as regulators attempt to address rising concerns over youth mental health and online harm.
The policy and its rationale
The federal government has announced plans to introduce strict age verification requirements for social media accounts, aiming to prevent anyone under 16 from accessing platforms such as Instagram, TikTok, and Snapchat. The proposal follows mounting pressure from parents, educators, and health professionals who cite growing evidence linking social media use to anxiety, depression, and body image issues among adolescents.
Communications Minister Michelle Rowland said the government is working to establish a “world-leading verification system” in consultation with privacy experts and digital safety regulators. The system is expected to require a combination of government-issued ID checks and facial recognition tools to validate a user’s age before granting access.
Challenges of enforcement
Despite broad support for increased protections, critics question the practical and ethical feasibility of enforcing such a ban. Technology experts warn that many young users already bypass age limits simply by entering fake dates of birth, while privacy advocates raise concerns about requiring private companies to store sensitive identification data.
The eSafety Commissioner has acknowledged these difficulties, suggesting enforcement may initially rely on platform cooperation rather than government surveillance. However, non-compliant firms could face stiff penalties, including fines or being blocked from operating in Australia altogether.
Civil liberties groups and digital rights advocates also fear that mandatory ID checks could lead to broader surveillance practices and threaten online anonymity, particularly for vulnerable groups.
Which platforms will be affected?
Mainstream platforms such as Meta’s Facebook and Instagram, ByteDance’s TikTok, and Snap Inc.’s Snapchat are likely to be directly affected, given their popularity with teenagers and their track records of lapses in youth safeguards.
However, the government has indicated that messaging-only services such as WhatsApp and Telegram — where users communicate directly without publicly visible profiles — may be exempt, at least in early iterations of the legislation. Platforms geared primarily toward gaming or educational purposes, such as Roblox or Duolingo, could also be treated differently depending on their core functionality and audience.
YouTube remains a grey area, as it combines video content, algorithmic recommendations, and social features. Officials say the final list of regulated services will be defined through a combination of public consultation and regulatory assessment.
Global trend, local test case
Australia’s move comes amid a broader international reckoning over how to protect children online. The European Union has strengthened its Digital Services Act, while several US states have introduced their own underage social media restrictions. Australia’s approach may serve as a regional model, but success will depend on building public trust in both the policy’s intent and its execution.
Ultimately, the question is not just whether the law can be enforced, but whether it will work as intended without overreaching into citizens’ digital rights.
Newshub, 31 July 2025