2 min read
Tech Against Terrorism’s Written Evidence to the Inquiry into the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019
Fred McDonnell Oct 22, 2021 9:22:27 AM
For our full submission, please see here.
Submitted: 14 October 2021
1. BACKGROUND
The Australian Parliament passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 on 4 April 2019. On 9 September 2021, the Parliamentary Joint Committee on Law Enforcement (the committee) agreed to inquire into and report on the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (AVM Act), with particular reference to:
- the effectiveness of the AVM Act in ensuring that persons who are internet service providers, or who provide content or hosting services, take timely action to remove or cease hosting abhorrent violent material when it can be accessed using their services;
- the effectiveness of the AVM Act in reducing the incidence of misuse of online platforms by perpetrators of violence;
- the appropriateness of the roles and responsibilities of the eSafety Commissioner and Australian Federal Police under the AVM Act;
- the appropriateness of the obligations placed on persons who are internet service providers, or who provide content or hosting services, under the AVM Act;
- the definition of abhorrent violent material under the AVM Act;
- any related matter.
2. TECH AGAINST TERRORISM’S CORE ARGUMENTS AND RECOMMENDATIONS
The arguments in our response to the consultation on the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (AVM Act) may be summarised as follows:
1. The Act’s high penalties and short timeframe for content removal create dangerous incentives for tech companies to err on the side of over-removal of content in order to avoid the threat of liability and harsh penalties. Tight removal deadlines, combined with high fines for platforms unable to moderate content in time, significantly undermine human rights, particularly freedom of speech.
2. The Act’s penalties lack consideration for smaller platforms. Not only does the AVM Act apply to companies of all sizes, but in theory a penalty could be imposed for every individual piece of content posted (including duplicates). Imposing stringent legal requirements with no regard for platform size will harm the diversity and innovation that drive the tech sector: such requirements disproportionately impact smaller tech companies with fewer resources to support compliance, whereas larger tech platforms are more likely to be able to allocate the resources necessary to comply with the Act.
3. The Act creates uncertainty by lacking operable definitions where clarity is crucial. In particular, it does not define what is considered “expeditious” removal. Coupled with harsh fines and penalties, this uncertainty has serious consequences both for compliance with the legislation and for freedom of expression.
4. The Act empowers the eSafety Commissioner to issue a notice triggering a presumption that a service provider has been “reckless” about its service hosting abhorrent violent material, a presumption that is difficult for the service provider to rebut. This gives companies a further incentive to err on the side of over-removal of content in order to avoid liability.
5. The Act takes an extraterritorial approach to the removal of abhorrent violent material, which could effectively see national laws applied globally and raises questions about territoriality. Any social media platform through which footage of abhorrent violent material occurring anywhere in the world can be accessed may be liable for this offence, provided the material is “reasonably capable of being accessed within Australia”. The potential reach of this offence is therefore extraordinary.
6. The Act was passed through both houses of Parliament in a remarkably short time. This limited the possibility of any consultation with industry or civil society, or for policymakers to amend the draft legislation in time to incorporate recommendations submitted during a consultation process.
For our full submission, please see here.