News

THE ONLINE REGULATION SERIES | AUSTRALIA (update)

Written by Adam Southey | Nov 18, 2021 1:04:05 PM

NOVEMBER 2021 UPDATE TO AUSTRALIA’S REGULATORY FRAMEWORK

Since our previous analysis of Australia’s regulatory framework was published in the Online Regulation Handbook in July 2021, Australia has passed the Telecommunications Legislation Amendment (International Production Orders) Act 2021 and the Surveillance Legislation Amendment (Identify and Disrupt) Act 2021, released an exposure draft[mfn]This type of draft bill is also frequently known as a “green paper” or a “consultation draft”.[/mfn] of the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021, and introduced a private member's bill in the Parliament of Australia to make social media companies liable for defamatory content posted on their platforms.

  • The Online Safety Act, which was passed in June 2021, will come into effect from 1 January 2022. The Act sets out to reform and expand existing online safety regulations by introducing five schemes to deal with different types of harmful online material. See below for our analysis of this Act, taken from its entry in the Online Regulation Handbook.

  • On 25 October 2021, Australia’s Attorney-General, Michaelia Cash, released an exposure draft of the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021, which would enable the creation of a binding Online Privacy code for social media services, data brokers and other large online platforms operating in Australia.
    • Under this proposal, online platforms subject to the code would need to comply with strict new privacy requirements, including stronger protections for children on social media. For example, under the code, social media platforms would be required to take all reasonable steps to verify their users' age and to give primary consideration to the best interests of the child when handling children's personal information. The code would also require platforms to obtain parental consent for users under the age of 16 (a rule sketched in code after this list).

    • The Online Privacy Bill will also toughen the penalties and enforcement powers available to Australia's privacy regulator, the Office of the Australian Information Commissioner.[mfn]See: Landmark privacy reforms to better protect Australians online, Attorney-General for Australia and Minister for Industrial Relations.[/mfn]

  • In August 2021, the Australian government passed the Surveillance Legislation Amendment (Identify and Disrupt) Act, which greatly extends Australia’s surveillance capabilities.[mfn]Freedom on the Net 2021: Australia, Freedom House[/mfn]
    • The Act grants the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) the ability to request new types of warrants to investigate and disrupt serious crime.[mfn]Freedom on the Net 2021: Australia, Freedom House[/mfn] To qualify, an offence must be ‘serious’ in nature, carrying a maximum sentence of three or more years’ imprisonment (a threshold also covered in the sketch after this list).[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn] To this end, the law creates three new classes of warrants for which the AFP and the ACIC may apply when conducting investigations of online activity. The new warrants - which are granted where a law enforcement officer reasonably suspects that one or more relevant offences are being, about to be, or likely to be committed - include:[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn]
      • Data disruption warrants, which grant access to data held on one or more computers in order to undertake 'disruption activities' that frustrate the commission of criminal activity;

      • Network activity warrants, which enable the collection of intelligence on serious criminal activity being conducted by criminal networks operating online;

      • Account takeover warrants, which enable a person's online account to be taken over to gather evidence of criminal activity.

    • The warrants empower the AFP and the ACIC to - amongst other things - add to, copy, delete or alter other data in the target computer or account, and to intercept a communication passing over a telecommunications system, if the interception is necessary for the execution of the warrant.[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn] In addition, law enforcement officers may take any reasonably necessary steps to conceal the fact that any action has been taken pursuant to the warrant.[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn]

    • Law enforcement officers executing one of the warrants can use a variety of technologies, including:
      • the target computer;

      • a telecommunications facility operated or provided by the Commonwealth or a carrier;

      • any other electronic equipment; or

      • a data storage device.

    • The Act also enables law enforcement officers to apply to a court for an order requiring a specified person to provide “any information or assistance” that is reasonable and necessary to allow the officer to execute one of these new warrants. An exception allows a person to object to disclosure where the information, if disclosed, could reasonably be expected to reveal details of data disruption, network activity or account takeover technologies or methods.[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn]

    • The new law makes it an offence to use or disclose, without authorisation, protected information obtained through the use of a warrant. Use or disclosure of protected information that jeopardises the conduct of an investigation or endangers the health or safety of any person is punishable by up to ten years' imprisonment.[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn]
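
To make two of the numeric thresholds in this update concrete - the draft Online Privacy code's parental-consent rule for users under 16, and the Identify and Disrupt Act's requirement that a relevant offence carry a maximum sentence of three or more years - the following minimal Python sketch encodes them as simple checks. All function and constant names here are our own and purely illustrative; they are not drawn from any statute or official guidance.

```python
# Illustrative sketch only: encodes two thresholds described above as simple
# checks. All names are hypothetical, not taken from any statute or guidance.

PARENTAL_CONSENT_AGE = 16      # draft Online Privacy code: consent needed under 16
WARRANT_SERIOUSNESS_YEARS = 3  # Identify and Disrupt Act: 3+ year maximum sentence

def requires_parental_consent(user_age: int) -> bool:
    """Draft code: platforms would need parental consent for users under 16."""
    return user_age < PARENTAL_CONSENT_AGE

def offence_meets_warrant_threshold(max_sentence_years: float) -> bool:
    """Identify and Disrupt Act: the new warrants apply only to offences
    carrying a maximum sentence of three or more years' imprisonment."""
    return max_sentence_years >= WARRANT_SERIOUSNESS_YEARS

if __name__ == "__main__":
    print(requires_parental_consent(15))          # True: consent required
    print(offence_meets_warrant_threshold(2.0))   # False: below the threshold
```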

The Right to Privacy

The Surveillance Legislation Amendment (Identify and Disrupt) Act, passed in August 2021, greatly extends both Australia’s surveillance capabilities and the potential for infringements of users’ right to privacy.[mfn]Ellison Minter (2021), How might the new Identify and Disrupt laws impact you?, Lexology[/mfn] This is particularly true of the three new warrants introduced in the Act, which empower the AFP and the ACIC to add to, copy, delete, or alter data in a target computer or account, as well as to intercept communications passing over a telecommunications system. It is important that governments ensure that human rights, particularly the right to privacy, are upheld when countering terrorist and extremist use of the internet. We understand the need for governments to access communications for investigations; however, such access should not weaken the security and privacy of users. According to the Human Rights Law Centre, there are concerns that such powers could be “used to monitor the online activities of journalists and whistle-blowers”.[mfn]Online surveillance bill a dangerous overreach, Human Rights Law Centre.[/mfn]

Tech Against Terrorism submitted written evidence to the inquiry into the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, see here.

THE ONLINE REGULATION SERIES 2020 | AUSTRALIA

Please note: The below is a copy of the Australia entry in our July 2021 Online Regulation Series Handbook.

Amongst the global key trends identified by Tech Against Terrorism, Australia follows:

  • Mandating short removal deadlines

  • Outsourcing legal adjudication to tech companies

  • Mandating transparency and accountability

The Broadcasting Services Amendment (Online Services) Act 1999 has regulated harmful and illegal online content in Australia since the late 1990s, establishing the country’s legislative framework for online content regulation.

Australia’s regulatory framework

  • The Enhancing Online Safety Act 2015 prohibits the sharing of, amongst other things, threatening posts on social media, and creates a “complaint and objection” system under the supervision of the newly established e-Safety Commissioner (2015).

  • The Online Safety Act, passed in June 2021 and coming into effect from 1 January 2022, sets out to reform and expand existing online safety regulations. The Act introduces five schemes to deal with different types of harmful online material:
    • Four schemes already exist in law but are being updated by the Act: these include the cyber-bullying, image-based abuse and online content schemes.

    • One is new: the adult cyber abuse scheme. The Act also introduces new, shorter takedown deadlines as well as industry codes.

  • The Online Safety Charter outlines Australia’s expectations of online service providers to protect Australians from harmful online experiences.

  • The Taskforce to Combat Terrorist and Extreme Violent Material Online produced a report on how the government and the tech industry could improve their ability to prevent and respond to future online crisis events. As a result of the report’s recommendations, ISPs and the government have agreed to a new protocol allowing the blocking of websites that host graphic material depicting a terrorist act or violent crime.

Relevant National Bodies

  • The e-Safety Commissioner is empowered under the Enhancing Online Safety Act 2015.
    • The Commissioner administers the Online Content Scheme, and can issue notices to service providers over content that violates the Criminal Code Amendment Act 2019.

    • The Commissioner can direct internet service providers (ISPs) to block access to material that exposes people in Australia to online terrorist and extreme violent material, but only during crisis events.

    • The Online Safety Act enables the Commissioner to utilise a new rapid website blocking power to block websites hosting abhorrent violent or terrorist material during an online crisis event, such as the Christchurch attack in 2019. The Act also requires search engines and app stores to remove access to a website or app that “systematically ignores” take down notices for class 1 material, such as child sexual abuse material.

    • The Commissioner produces annual reports on their performance, including on their assistance and investigations.

Key takeaways for tech companies 

  • All internet content and service providers operating in Australia must comply with the Online Content Scheme, which provides the legal basis for restricting prohibited online content.

  • Violation of the Criminal Code Amendment Act 2019 can be sanctioned by:
    • A fine of around $1.5mn[mfn]AU$2.1 mn[/mfn] or up to three years in prison (for an individual providing the content services or hosting services).
    • A fine of up to around $7.5mn[mfn]AU$10.5 mn[/mfn] or 10% of annual revenue for each offence (for a company); these figures are illustrated in the sketch after this list.

  • Examples of potential violations of the Criminal Code Amendment Act 2019 include:
    • Providing a content service or hosting service which can be used to access abhorrent violent material.  
    • Failing to expeditiously remove, or cease hosting, such material following notification from authorities.
    • Failing to refer details to the Australian Federal Police after becoming aware of such content being available on their service. 

  • The e-Safety Commissioner can initiate investigations relating to online content and can take enforcement actions such as issuing notices:
    • The Commissioner can block access in Australia to certain content hosted overseas by notifying Australian ISPs about the content.
    • The e-Safety Commissioner can issue a notice, under the Criminal Code Amendment Act 2019, triggering the presumption that a service provider has been “reckless” about its service hosting abhorrent violent material. 

  • The Online Safety Act introduces the following:
    • A 24-hour deadline for online service providers to act on a notice from the eSafety Commissioner concerning image-based abuse, cyber-abuse, cyber-bullying and seriously harmful content (the deadline computation is shown in the sketch after this list).
    • Expanded cyber-bullying scheme for children, which enables the removal of material from online services including social media platforms, games, websites, messaging and hosting services. 

    • Basic Online Safety Expectations, which establish mandatory reporting requirements allowing the eSafety Commissioner to require online services to provide specific information about online harms. This could include information about responses to terrorism and abhorrent violent material, or to volumetric attacks. Services will have to report on how they uphold these expectations and can be penalised if they fail to do so.
    • An update to Australia’s Online Content Scheme. This reflects and simplifies the current regime in Schedules 5 and 7 of the Broadcasting Services Act (BSA), with some clarifications of the material and the providers of services captured by the scheme, and extends the eSafety Commissioner’s take-down powers for some material to international services in some circumstances. The update also provides that bodies and associations representing sections of the online industry may develop industry codes.
    • A new cyber abuse scheme, which allows the eSafety Commissioner to remove seriously harmful online abuse when websites, social media and other online services do not remove content after a complaint is made.
    • These protections will be backed by civil penalties for service providers who fail to comply.  
    • Extended powers for the eSafety Commissioner:
      • The eSafety Commissioner has a new rapid website-blocking power, which can be used to block websites hosting abhorrent violent or terrorist material during an online crisis event.
      • The eSafety Commissioner can require search engines and app stores to remove access to a website or app that “systematically ignores” take-down notices for class 1 material under the Online Content Scheme, such as child sexual abuse material.
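
As a rough illustration of two compliance figures above - the 24-hour deadline for acting on an eSafety Commissioner notice, and the corporate penalty of up to around AU$10.5mn or 10% of annual revenue per offence - here is a minimal Python sketch. All names are our own, and the assumption that the larger of the two corporate figures bounds a company's exposure is ours, not the Act's; the exact construction should be checked against the legislation itself.

```python
# Illustrative sketch only, not legal advice. Encodes the 24-hour takedown
# deadline and the corporate penalty figures described above. The "larger of
# the two figures" rule below is our assumption, not the Act's wording.
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=24)   # Online Safety Act notice deadline
CORPORATE_FINE_CAP_AUD = 10_500_000     # approximate fixed corporate fine
REVENUE_PENALTY_RATE = 0.10             # 10% of annual revenue, per offence

def removal_deadline(notice_received: datetime) -> datetime:
    """Latest time to act on an eSafety Commissioner removal notice."""
    return notice_received + TAKEDOWN_WINDOW

def max_corporate_exposure_aud(annual_revenue_aud: float, offences: int = 1) -> float:
    """Assumed upper-bound penalty exposure for a company (see caveat above)."""
    per_offence = max(CORPORATE_FINE_CAP_AUD,
                      REVENUE_PENALTY_RATE * annual_revenue_aud)
    return per_offence * offences

if __name__ == "__main__":
    notice = datetime(2021, 11, 18, 9, 0, tzinfo=timezone.utc)
    print(removal_deadline(notice))                       # 2021-11-19 09:00:00+00:00
    print(f"AU${max_corporate_exposure_aud(2e9):,.0f}")   # AU$200,000,000
```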

Tech Against Terrorism Commentary[mfn]The below comments can also be found in our submission and recommendations to the Online Safety Act’s consultation, see here.[/mfn]

e-Safety Commissioner

It is commendable that the main body in charge of coordinating and encouraging action from tech companies, the e-Safety Commissioner, has a clear legal standing. This ensures that several of the instruments provided (such as removal orders) are carried out in accordance with the rule of law. It is also positive that the e-Safety Commissioner produces annual reports on their performance, including on their assistance and investigations.

Rule of Law

The Online Safety Act risks leading to extensive takedown of legal (but ‘harmful’) speech. For example, whilst cyber bullying and abuse are issues that tech companies should counter for ethical reasons, compelling them to do so under threat of potential liability and financial penalties risks undermining the rule of law. Whilst some aspects of bullying and abuse are anchored in Australia’s criminal code, the definitions provided in the Act suggest that the law will potentially lead to removal of large amounts of legally allowed speech. In a democracy, speech that is legal offline should not be illegal in the online space. If harms need countering online, they should be prohibited in law before legislation is created to remove such content from the internet.

The tech sector should not develop codes that can subsequently be introduced into law with legal liability and attendant financial penalties. Whilst improved industry codes should be encouraged, it is important that legislation is determined by democratically accountable institutions. Thus, there are some concerns regarding the legitimacy of the development of industry codes – which may be drafted by bodies and associations representing sections of the online industry – within the Online Safety Act.

Freedom of Expression

We are concerned that the Online Safety Act makes no clear reference to safeguards against the erroneous removal of content as a result of blocking or removal requests. This is particularly serious for link-deletion and app-removal requests, as these are severe steps with a potentially detrimental impact on freedom of information if carried out too extensively. Furthermore, the Act makes no reference to redress mechanisms.

There are a number of imprecise definitions that we believe will negatively impact freedom of expression. The definitions provided for child cyber bullying and cyber abuse seem to build on a perceived ‘common sense’ approach as opposed to legal concepts. This therefore risks decisions being made subjectively. Not only could this lead to the removal of legal content, but it would also be difficult for tech companies to operationalise.

We have some concerns around the Abhorrent Violent Material (AVM) scheme. Whilst the scope of the law is clear, we worry that imprecise definitions of “terrorist act” and calls for companies to remove content “expeditiously” could encourage tech platforms to remove content that is shared with the purpose of documenting terrorist offences and war crimes. Such content can serve as crucial evidence in court proceedings. We appreciate the necessity to restrict access to content that risks becoming viral in the immediate aftermath of a terrorist attack. However, due to the drastic measures that the Act allows for, the Government should ensure that there are sufficient safeguards in place in case of wrongful blocking and that appropriate redress mechanisms are identified.

Smaller tech companies and tech sector capacity

The Online Safety Act does not explicitly refer to smaller tech companies, which often do not have the capacity to take swift action due to limited staff numbers or subject-matter expertise on various harm areas. Since it is well established that terrorists predominantly exploit smaller platforms for exactly this reason[mfn]To read more about this, please see our analysis on ISIS’s use of smaller platforms and the DWeb to share terrorist content – April 2019 here[/mfn], it is disappointing that this is not reflected in the Act. Specifically, we worry that instruments such as the 24-hour removal and blocking deadlines (breaches of which are punishable by steep fines) will severely harm competition and innovation.

Lack of Consultation

The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 passed through both houses of parliament in a remarkably short time. Similarly, the Online Safety Act entered parliament only 10 days after the public consultation on it closed. This limits the scope for consultation with industry and civil society, and for policymakers to amend draft legislation in time to incorporate recommendations submitted during a consultation process.

_____________

Tech Against Terrorism participated in the consultation for the Online Safety Act. To read our full submission and recommendations, see here.