Reader's Digest – 28 August 2020

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.

Top Stories

  • YouTube released its second quarterly Community Guidelines enforcement report, detailing its metrics for the period of April to June. In addition, YouTube published a blog post explaining the company's policies in response to COVID-19.
  • Discord has published its transparency report for the first half of 2020, detailing its updated community guidelines, its new safety centre, and how Discord was affected by COVID-19.
  • The Global Network Initiative (GNI) has announced that Cloudflare has joined the network as an observer. Cloudflare’s General Counsel, Doug Kramer, said that the new role will help Cloudflare further structure its efforts to preserve and promote human rights.


Tech policy

  • Content moderation is particularly hard in African countries: In this article, Tomiwa Ilori analyses the challenges of content moderation in Africa. Ilori highlights that challenges in the region relate in particular to the complexity of applying global content moderation standards to local contexts. Whilst these challenges are present across countries, Ilori argues that factors of colonial legacy, a shrinking civil space, and government restrictions on online speech make content moderation particularly difficult in Africa. Regarding colonial legacy, Ilori describes how laws made in the colonial era, which were more restrictive of rights such as freedom of speech, are now being applied to online speech. African countries are working to change these laws; however, Ilori argues that, for now, they risk inaccurately deeming online content a security threat. Furthermore, Ilori argues that because some governments are limiting civil society, human rights organisations play a declining role in raising the potential risks of content moderation. Ilori concludes that a multi-stakeholder approach with diverse expertise, such as Article 19’s Social Media Councils, shows the most promise in tackling these problems in Africa and needs to be developed further. (Ilori, Slate, 21.08.20).

  • After sending content moderators home, YouTube doubled its video removals: In this article, Issie Lapowsky discusses YouTube’s latest transparency report (see Top Stories). The report notes an increased reliance on automated content removal, which has led to a doubling of removals on the platform. This comes as a result of YouTube’s employees having to stay at home because of COVID-19, which led the company to expand its use of automated filters to compensate for the absence of human moderators. Consequently, YouTube deleted more content in the second quarter of 2020 than in the first, including “violent extremist content”. By way of explanation, YouTube has said it relied on automated systems that may be “less accurate”, but that do remove egregious violations such as child sexual abuse material and violent extremist content from the platform. The second quarter also saw user appeals double in response to the spike in removals. Lapowsky compares these results to Facebook’s latest transparency report, which, contrary to YouTube’s, showed a decline in content taken down because its moderators were unable to log harmful content into the automated systems that aid content moderation. (Lapowsky, Protocol, 25.08.20).

  • Content moderation knowledge sharing shouldn't be a backdoor to cross-platform censorship: In this article, Emma Llansó cautions that knowledge sharing amongst tech companies regarding online content moderation, including of child sexual abuse material or terrorist content, could potentially lead to cross-platform censorship. Llansó describes several ways in which platforms share information to support content moderation, including via technical approaches such as the GIFCT’s hash-sharing database (a minimal sketch of hash-based matching follows below). Llansó notes that there are several advantages to this, for example in supporting smaller tech companies that often cannot tackle various harmful content types due to limited capacity. However, she warns that intra-industry knowledge sharing can also lead to the establishment of a “de-facto standard of acceptable speech across the Internet”, as a consequence of smaller companies adopting the same content moderation standards as bigger companies. It might also synchronise content moderation responses since, according to Llansó, larger companies are more likely to share resources with smaller companies than vice versa, creating a one-way street. Instead, Llansó suggests that knowledge sharing should be a two-way street, and notes that smaller companies are potentially in a better position to create innovative approaches to content moderation that digress from “mainstream” solutions. (Llansó, Techdirt, 21.08.20).

To learn more about Tech Against Terrorism’s work with smaller tech companies, please click here.
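To make the hash-sharing discussion above more concrete, here is a minimal, hypothetical sketch of how matching uploads against a shared industry hash database works in principle. This is not the GIFCT’s actual implementation: consortium databases typically use perceptual hashes so that visually similar media also match, whereas this sketch uses an exact SHA-256 digest only to stay self-contained. All names (`SHARED_HASH_DB`, `fingerprint`, `moderate_upload`) are invented for illustration.

```python
import hashlib

# Hypothetical shared database of fingerprints of known violating content.
# Real consortium databases use perceptual hashes (which tolerate minor
# edits to the media); SHA-256 is used here only for a runnable example.
SHARED_HASH_DB = {
    # SHA-256 of b"test", included so the demo below matches.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Return a hex digest for uploaded media (stand-in for a perceptual hash)."""
    return hashlib.sha256(content).hexdigest()

def moderate_upload(content: bytes) -> str:
    """Flag an upload if its fingerprint appears in the shared database.

    Llansó's caution follows directly from this design: every platform
    consuming the same database makes the same call here, which is how a
    de facto cross-platform standard of acceptable speech can emerge.
    """
    if fingerprint(content) in SHARED_HASH_DB:
        return "flagged_for_review"
    return "allowed"

print(moderate_upload(b"test"))   # -> "flagged_for_review"
print(moderate_upload(b"other"))  # -> "allowed"
```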
 

This week, we’re listening to “Emma Llansó on the most important content moderation database you’ve never heard of”, a conversation between Evelyn Douek and Emma Llansó about the shared hash database of the Global Internet Forum to Counter Terrorism (GIFCT), which aids platforms’ content moderation decisions.

Islamist terrorism

  • I will tell you a story about Jihad: IS's propaganda and narrative advertising: In this article, Anna Kruglova applies a framework of narrative advertising to the Islamic State (IS) magazines Dabiq and Rumiyah to showcase the success of IS’s propaganda strategies. The narrative advertising approach views IS as a type of brand, with narratives used to promote its message and gain new “customers”, or recruits. Kruglova argues that IS used this narrative advertising emotively, which made recruits emotionally connected to the group’s narratives and caused them to lose connection with the real world. To demonstrate this, Kruglova identifies a number of key narratives that IS instrumentalised, all of which addressed recruits’ psychological needs and appealed to people’s self-image and self-perception rather than to their political and socio-economic circumstances. As a result, Kruglova concludes that counter-narratives need to target this emotional component and go beyond mere ideological counter-messaging. (Kruglova, GNET, 25.08.20).


Far-right violent extremism and terrorism

  • Moving away from Islamist extremism - assessing counter-narrative responses to the far-right online: In this article, Dr. William Allchorn stresses the importance of utilising counter-narratives in countering far-right extremism online. Whilst counter-narrative and related counterterrorism research often focuses on Islamist extremism, Allchorn emphasises that its application to far-right extremism deserves more attention from academics and practitioners. Allchorn highlights a project by the Centre for Analysis of the Radical Right and Hedayah that systematically reviewed the counter-narratives used by several countries and identified the main recruitment narratives. These counter-narratives use different pathways to deconstruct extremist narratives, such as redirecting people away from online extremist content or entering into difficult conversations with extremists online. Allchorn concludes that empirically proven and theoretically grounded counter-narratives that offer alternatives to extremist milieus are essential in countering far-right extremism. (Allchorn, Centre for Analysis of the Radical Right, 26.08.20).


For any questions, please get in touch via:
contact@techagainstterrorism.org

