The Online Safety Bill, if passed in its current form, could further undermine political accountability by enabling the removal of footage of, for example, police violence or human rights abuses. That the government is not listening to concerns about the bill’s wide powers suggests some of the consequences may be intended. Samantha Floreani reports.
The Online Safety Bill is made up of several schemes, some of which are appropriate ways to handle online harm. The schemes covering cyber-bullying, cyber-abuse and image-based abuse provide powers to help remove material that is harmful to children and seriously harmful to adults. Intimate images shared without consent can also be taken down. Notably, these powers are reactive: they respond to complaints made by those harmed.
The bill also introduces “basic online safety expectations”, which allow the eSafety Commissioner to introduce industry standards and technical requirements. The bill includes an online content scheme and a scheme tackling abhorrent violent material, providing the power to block and take down content deemed offensive or violent.
These powers are generally proactive. The eSafety Commissioner will have a mandate to search the internet for content that is covered by the bill. Essentially, the eSafety Commissioner will have the power to arbitrate what content Australians are able to access.
While the powers are broad, so too is the scope of the content and services to which they apply. The Commissioner can order the removal of content categorised as Class 1 or Class 2 material, which, at its lowest, corresponds to anything deemed R18+ or above in the National Classification Code. This captures any sexual content, violent or not, or content that is “unsuitable for a minor to see”.
These provisions apply to social media platforms as well as “relevant electronic” and “designated internet” services, including email, SMS, instant messaging, online gaming or any service that allows users to access material using the internet.
Taken together, the bill covers sexual content that many Australians consensually engage with, including in their personal correspondence. This is clearly an overstep.
Much of the debate has focused on the harm caused to children when they access material considered to be “offensive”, but little consideration appears to have been given to the potential harm of denying access. For instance, many LGBTQ+ youth rely on the internet and pornography to counter the lack of inclusive sex education.
Freedom of expression censored
Children’s safety is important but it should not be politicised to pass overbroad legislation without checks and balances.
The bill would also likely cover content such as footage of human rights abuses or violence by police that could be used to ensure political accountability. The current public interest exemption does not go far enough to guarantee that Australians can access material that holds those in power to account.
Civil society groups have expressed concerns that the powers of the eSafety Commissioner could be used to repress freedom of expression and censor sex workers. Current Commissioner Julie Inman-Grant has dismissed such concerns as “ill-founded”, saying that the sex industry is “not her concern”.
Yet the explanatory memorandum contradicts this, describing the intention to develop a “comprehensive roadmap” for the regulation of pornography. Internationally, swathes of sex workers’ support, safety and advocacy groups have been taken off digital platforms as a result of comparable US legislation.
Blanket censorship and over-compliance are likely consequences of the bill. The measures may give platforms such as Facebook an incentive to pre-emptively remove content rather than risk a penalty.
Another issue is how the powers may extend to encrypted services. The bill leaves the door open to giving the Commissioner access to communications, including some that are currently end-to-end encrypted. Inman-Grant has even argued against end-to-end encryption in general, claiming that it facilitates online child sexual abuse.
Regressive surveillance agenda
Such claims promote a regressive surveillance agenda at the expense of our digital security. Robust encryption is essential for the digital security of individuals and governments alike.
If we’re genuinely concerned about children’s safety online, consider how encryption protects them from predatory users tapping into their webcams or intercepting their communications with friends and family.
Without amendments, the bill could be used to compel providers to weaken encryption, a sure-fire way to undermine online safety for all of us, children included.
Public submissions ignored
The consultation process has been rushed, suggesting that the government is not meaningfully engaging with public concerns. Some 376 submissions were made to the public consultation, yet just 10 days later the bill was tabled in parliament with no meaningful amendments.
Then, despite submitters being given only three working days’ notice, another 135 submissions were made to the Senate inquiry. Again, no meaningful recommendations or amendments were made.
The proposed law gives an alarming amount of discretionary power to one official to determine what adult Australians can access online. The lack of meaningful consultation suggests that the government is not interested in having a nuanced debate about complex issues of personal autonomy, individual responsibility and online harm reduction.
If the Morrison government continues to ignore public concerns, including the call for transparency and accountability, then perhaps these consequences are not so unintended after all.
Sam is a campaign officer at Digital Rights Watch, working at the intersection of feminism, human rights and technology. As former Program Director for Code Like a Girl, Sam is dedicated to the ethics of technology in all its forms, from gender equity in the tech industry to upholding privacy in an increasingly surveillance-obsessed world.