Submitted: 14 October 2021
1. BACKGROUND
The Australian Parliament passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 on 4 April 2019. On 9 September 2021, the Parliamentary Joint Committee on Law Enforcement (the committee) agreed to inquire into and report on the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (AVM Act).
2. TECH AGAINST TERRORISM’S CORE ARGUMENTS AND RECOMMENDATIONS
The arguments in our response to the consultation on the AVM Act may be summarised as follows:
1. The Act’s high penalties and short timeframe for content removal create dangerous incentives for tech companies to err on the side of over-removal of content in order to avoid the threat of liability and harsh penalties. Tight removal deadlines, combined with high fines for platforms that are unable to moderate their services at the required pace, push companies towards over-removal. This significantly undermines human rights, particularly freedom of speech.
2. The Act lacks consideration for smaller platforms in terms of its penalties. Not only does the AVM Act apply to companies of all sizes, but in theory a penalty could also be imposed for every individual piece of content posted (including duplicates). Such stringent legal requirements will disproportionately impact smaller tech companies with fewer resources to support compliance, whereas larger tech platforms are more likely to be able to allocate the resources necessary to comply with the Act. Imposing stringent legal requirements with no regard for platform size will harm the diversity and innovation that drive the tech sector.
3. The Act creates uncertainty by lacking operable definitions and clarity where they are crucial, most notably on what is considered “expeditious” removal. Coupled with harsh fines and penalties, this uncertainty has serious consequences for compliance with the legislation as well as for freedom of expression.
4. The Act empowers the eSafety Commissioner to issue a notice triggering the presumption that a service provider has been “reckless” about its service hosting abhorrent violent material, a presumption that is difficult for the service provider to rebut. This gives companies a further incentive to err on the side of over-removal of content in order to avoid liability.
5. The Act takes an extraterritorial approach to the removal of abhorrent violent material, which could effectively see national laws being applied globally and raises questions about territoriality. Any social media company that can be used to access footage of abhorrent violent material occurring anywhere in the world can be held liable for this offence, provided that the material is “reasonably capable of being accessed within Australia”.6 The offence therefore has an extraordinary potential reach.
6. The Act was passed through both houses of parliament in a remarkably short time. This left little possibility for consultation with industry or civil society, or for policymakers to amend the draft legislation in time to incorporate recommendations submitted during a consultation process.
For our full submission, please see here.