Social media age ban is upon us. What does it really mean?

by Daniel Angus | Dec 8, 2025 | Comment & Analysis, Latest Posts

The much-heralded and little-understood social media ban for kids comes into effect on Wednesday, but will it solve anything, asks QUT Professor Daniel Angus.

Young people in Australia are on the verge of a profound shift in their digital lives. And while public debate on the ban has focused on parenting choices, the real issue is corporate compliance, technical design, and safe spaces for young people.

The Online Safety Amendment (Social Media Minimum Age) Act 2024 has been sold by the Australian government as a straightforward fix for a range of complex problems. The policy seeks to prevent under-16s from holding social media accounts on designated platforms.

Supporters of Australia’s social media age ban, many with strong links to traditional media organisations, argue it will protect children from online harm, reduce bullying, and limit exposure to addictive design. Yet there is limited evidence that a ban will achieve these lofty ambitions; many researchers instead cite concerns about exclusion, migration to unregulated platforms, and the privacy risks created by mandatory age verification.

Public debate on the subject has been loud and emotive. Yet key facts about how the law works have been missing. It is important to understand what the ban actually does, what it does not do, and what risks may follow.

Who does the law apply to?

The law does not ban children from being on social media. It does not make it illegal for a young person to have an account. It does not make parents into criminals if they help their children stay online. The law applies only to social media platforms that the Minister for Communications has designated.

The duty sits with the platform, not with families.

This point matters. Many public comments have suggested that parents who help teens remain online are doing something deviant. They are not. They are acting in the same grey zone that families have always navigated around digital life. The law is a compliance burden on platforms, not a criminal offence for users.

In practice, platforms must take reasonable steps to prevent anyone under 16 years of age from holding an account. They are expected to use a range of new, possibly invasive and inaccurate, age-assurance checks, and to show that they are making a genuine effort to comply. We do not yet know how the system will treat people, young or old, who do not have formal identity documents, although the law prohibits such documents being the sole mechanism for proving age.

Ban on logging in, not access

The law is written around the idea that any harm young people face only happens when they are logged in. This reflects a belief that the logged-in state shapes how content reaches teens.

It is true that recommendation systems target users when they are signed in. It is also true that accounts shape what teens see and who they interact with. But most platforms can still be accessed in a logged-out state (e.g. TikTok and YouTube). The ban does not stop teens from visiting these sites; it only stops them from having personal accounts that follow them from device to device.

This creates an odd gap. After the ban, a teen may still scroll a feed, watch videos or search for content without logging in. Their experience may be less tailored, but it may also be more random and possibly more risky. Logged-out access can surface a wide mix of material, whereas a personalised feed can filter some unwanted content. Removing accounts thus does not remove exposure.

But we are not talking enough about this difference.

Young people who relied on personal feeds to guide them to trusted voices may now see a flood of unrelated or even harmful material. Young people who had private communities may now be pushed into a broad search environment. The policy focuses on removing personalised attention systems.

It does not give teens a safe public space in return.

Differing impacts for young people

No two young people use the internet in the same way. Though social media can bring real harm, it can also bring life-saving community support and care.

The risk is highest for young people who already live in precarious situations. Young people in remote areas have long used digital platforms to find peers and support beyond their towns. Indigenous youth have created rich spaces for culture and connection. Young people with diverse sexualities or gender identities often rely on online communities to feel safe and understood. Many do not have local services that can offer the same support.

For these groups the ban may take away a key part of daily life. It may create silence where there was once community. It may drive them toward spaces that are harder to monitor. It may push them onto commercial platforms that have not been designated under the ban yet still carry risks.

Thus, the ban will land unevenly. Some young people will stop using certain apps. Others will find workarounds. Some will shift further into private messaging systems. Others will be cut off from peers and shared interests.

If the aim is to protect young people from harm, we must plan for what comes next. The law is in place. Now the work begins to build alternatives that let young people reconnect. That means new sources of support and new public spaces online. It means services that are shaped with young people, not imposed on them.

Does not solve the main problems

The government has presented the ban as a strong response to bullying, mental health concerns, and exposure to harmful content. Yet none of these problems are solved by removing accounts. Bullying is a social problem that happens in schools, homes, and communities. It continues across many channels. It can move to messaging apps, gaming platforms, or group chats.

A ban on accounts on certain platforms does not address the cause.

Mental health is more complex than a sign-in screen. Digital life can worsen stress. It can also give help, peer support, and access to services.

The ban does not provide new mental health programs. It does not help families talk about safe use. It does not train teachers or create trust with young people. Instead it changes one feature of the digital experience and declares success before any evidence has been collected.

PM brands social media ban a success before it starts

It is concerning that the government is already framing this as a major win. The ban has not yet taken effect. We have not seen how teens will respond. We have not seen how companies will adjust their design to meet the law.

Declaring victory now risks letting real problems remain in the background.

Protecting rights and participation

Young people have always been part of public life. Digital platforms have given them ways to learn, share ideas, and organise movements. The school strike for climate is a clear example. This movement was shaped online by young people before they were old enough to vote. The age ban removes many tools they used to speak as a group. Their right to political participation remains a core part of the Convention on the Rights of the Child.

Policy should not silence these voices or treat them as a threat.

When the ban begins, we must find ways to restore the social role of young people. We need education that supports critical use of media. We need safe spaces for civic participation. We need to listen when young people tell us how they use digital tools and why.

It is easy to pass a law. It is much harder to build a strong support system and culture of care around young people.

Originally published under Creative Commons by 360info™.

Daniel Angus

Prof Daniel Angus is a Chief Investigator at the Queensland University of Technology node of the ARC Centre of Excellence for Automated Decision-Making & Society (ADM+S), and a Professor of Digital Communication in the School of Communication.
