Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.
Top Stories
- Ajit Pai, Chairman of the US Federal Communications Commission (FCC), has published a statement calling for the FCC to “clarify ambiguities in Section 230” and to interpret the provision, which grants tech companies immunity from legal liability. You can read more about this here.
- Tech Against Terrorism is proud to announce that the UK Safety Tech Innovation Network has added us to its list of safety tech providers.
- Facebook has announced that it will remove content that “denies or distorts the Holocaust”, as part of an update to its hate speech policy.
- The Proud Boys’ website and online store have been deplatformed by their hosting service, a Google Cloud customer.
- The Global Network Initiative (GNI) has published a new policy brief analysing proposals for regulating the online space from a human rights perspective and offering recommendations for policymakers.
- European Digital Rights (EDRi) has published its 2019 Annual Report, which provides an overview of its advocacy work and insights into the state of digital rights globally.
- Freedom House has released its Freedom of the Net 2020 report, which expands on the “race towards ‘cyber sovereignty’”, characterised by governments pushing for online regulation, and on the effects of Covid-19 on internet freedom.
Tech Against Terrorism’s Online Regulation Series continues throughout October to provide you with a global overview of the state and future of online regulation. The Series is underpinned by the work of digital and human rights organisations, such as GNI, EDRi, and Freedom House, which we will be sharing regularly during the Series. Make sure to follow us on Twitter and to visit our website to follow the Series!
This week the Series focused on North America: you can find our blog post about the US here, and the one on Canada here.
Tech policy
- Thank you for your transparency report, here’s everything that’s missing: Svea Windwehr and Jillian York of the Electronic Frontier Foundation (EFF) argue that tech platforms should move away from transparency reporting focused on the number of takedowns and towards “meaningful transparency”. Whilst they welcome transparency efforts by tech companies, Windwehr and York criticise existing transparency report templates, which they say lack explanation of the “behind-the-scenes” of content removal (the how and why). According to them, such reporting “misses the mark” by not allowing outsiders to “see and understand” removal actions. Instead, they argue, platforms should engage in “meaningful transparency” to be accountable to their users. Namely, platforms should clarify the basics of moderation, such as the size of the moderation team, its language capacity, the technologies used, and the balance between human and automated moderation. They should also better inform users by clearly detailing their moderation policies, and by being transparent about how those policies are developed and about the external actors that contribute to the process. In line with this call for “meaningful transparency”, the Santa Clara Principles on Transparency and Accountability in Content Moderation are currently undergoing a review process. (Windwehr and York, Electronic Frontier Foundation, 13.10.2020)
Jillian York presented at a webinar that we organised on tech platform accountability in July, which you can read more about here.
Far-right violent extremism and terrorism
- Tackling online radicalization at its offline roots: William Baldet expands on the “online radical right rabbit hole” and how far-right violent extremist propaganda draws young individuals into alternative sub-cultures. Baldet provides an overview of how violent extremists use social media to spread their racist memes and “disturbing humour”, exploiting platforms’ algorithms to propagate their views. In addition, these groups signpost to the “deeper recess[es]” of the internet in order to draw individuals towards the more extreme and explicit content found on closed or encrypted platforms, as well as on platforms unwilling or unable to moderate. Given the challenges posed by online extremist content (from identification to removal at scale), which are further complicated by the fact that most content shared by far-right violent extremists is not illegal, Baldet argues that other ways to disrupt radicalisation are needed. He calls for looking beyond the online space and into the offline cultural and political environments, as well as the social-psychological factors, that are entangled with online radicalisation. He particularly stresses that it is possible to “create psychological immunity” against misinformation and online conspiracies by explaining recruitment tactics to young individuals. He also underlines the value of promoting analytical thinking and alertness, as well as the idea of “prebunking” misinformation: exposing individuals to small doses of common recruitment techniques to “generate mental antibodies” against recruitment tactics and radicalisation. Baldet concludes by calling for efforts to counter violent extremism to consider the offline roots of radicalisation, and in particular for finding common ground between the Left and the Right to de-escalate drivers of extremism. (Baldet, RANTT Media, 13.10.2020)
- This week we are listening to Two Minutes Past Nine, a BBC podcast by Leah Sottile that takes a deep dive into the Oklahoma City bombing and Timothy McVeigh.
Islamist terrorism
- The rise of lockdown radicalism: Farooq Yousaf provides an overview of the risk of radicalisation linked to the deteriorating socio-political conditions that have followed the Covid-19 crisis, focusing on South Asia. In particular, he points out how terrorist groups have been “trying to cash in on these inequalities to propagate hate-filled narratives” throughout the health crisis. Yousaf also notes that various experts, including the Australian Security Intelligence Organisation, have warned that the conditions surrounding the pandemic are favourable to extremist activities, and that the UN Security Council has recently noted “a spike in the so-called Islamic State’s (IS) online activities”. Yousaf further links these risks of radicalisation to IS’s “family terror networks”, underlining the role such family networks play in the group’s global operations and recruitment strategies. He concludes that the “twin threats of virtual and physical recruitment” remain important challenges for countries in the Global South that might lack the necessary resources to counter them. (Yousaf, East Asia Forum, 14.10.2020)
- An Arrest in Canada casts a shadow on a New York Times star, and the Times: Ben Smith reflects on the consequences for media coverage of terrorism of the recent arrest in Canada of a man, calling himself Abu Huzayfah, who falsely claimed to have joined the Islamic State (IS) in Syria. Smith looks at how this arrest casts doubt on the work of a New York Times journalist and on a recent podcast series, Caliphate, which relied on the stories told by this man. Smith analyses how coverage of terrorism lends itself to “the seductions of narrative journalism”, with reporters relying on “untrustworthy sources” and “ambiguous” facts. Smith continues by stressing the impact such journalism can have, as it can “play easily into popular American hostility towards Muslims” and influence counterterrorism policy. (Smith, The New York Times, 11.10.2020)
Responsible reporting when covering terrorism is essential to ensure that news media do not become “PR for terrorists”. To learn more about this, you can listen to the Tech Against Terrorism Podcast episode on “How mainstream media can spread terrorist propaganda”, with Kyle Taylor from Hacked Off and Abdirahim Saeed from BBC Monitoring.
Counterterrorism
- Macron’s plan for fighting Islamist radicalisation – offline: Dr. Julian Junk and Clara Auguste Süß review President Macron’s 5-point plan to counter Islamist radicalisation in France. The plan targets the “Islamist separatism” identified as underpinning the radicalisation threat in the country, and focuses on: the neutrality of public services (and servants); making public funding of associations and civil society organisations conditional on their adherence to French Republican principles; the role of education and the school system; the State’s support for an “Islam des Lumières” (“enlightened Islam”) compatible with laicism and republican principles; and finally the importance of “enforcing existing law(s) and restoring respect”, especially in the “lost neighbourhoods” (banlieues) where radical Islam is said to proliferate. Junk and Süß argue that, whilst ambitious, this plan risks falling “into the same trap” as other European prevention efforts, which over-focus on security-related measures and Islamist groups, “los[ing] sight of a much larger societal problem, the one of right-wing extremism as well as co-escalation dynamics in between various extremist groups”. The authors conclude by underlining the plan’s lack of consideration for the online-offline nexus of radicalisation dynamics, which they assess to be a “wide gap” in an otherwise “holistic ambition”. (Junk and Süß, GNET, 12.10.2020)
For any questions, please get in touch via: contact@techagainstterrorism.org