Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.


Terrorist and violent extremist use of the internet

– The risks of a Telegram crypto-wallet: In 2018, Telegram started to raise funds for its upcoming cryptocurrency Grams, to be supported by the platform’s blockchain, the Telegram Open Network. However, Telegram’s cryptocurrency ambitions were halted by a US federal judge’s ruling in favour of the US Securities and Exchange Commission (SEC), following Telegram’s failure to register its digital asset with the SEC. Building on this decision and on Grams, Andrew Mines analyses the risks associated with Telegram integrating a crypto-wallet into its encrypted messaging platform. Mines focuses in particular on the exploitation of Telegram by terrorists and violent extremists, stressing that such actors have already been exploiting cryptocurrencies to raise money and transfer funds anonymously. In doing so, Mines stresses the risk of facilitating terrorist and violent extremists’ use of cryptocurrencies to bypass financial regulations and to transfer funds easily and anonymously through a widely used encrypted messaging app. (Mines, GNET, 04.05.2020)

– How do social media users talk about terrorism online?: How individuals who are “curious” discuss terrorism online remains largely “shrouded in secrecy.” Here, Dr. Alton Chua and Dr. Snehasish Banerjee analyse how terrorism is talked about on community question answering sites (CQAs) – platforms where users submit questions that are then answered by other users – particularly Yahoo! Answers. While some of the questions they studied were rather harmless, merely demonstrating “the online community’s information needs,” others were tainted with radicalisation, discrimination, and hate speech. Their results underline the potential of CQAs being exploited by terrorist actors. In light of this, Drs. Chua and Banerjee conclude with some recommendations on how CQAs can play their part in tackling terrorist use of the internet. In particular, they suggest removing anonymity options and deploying filters on sensitive language to counter the trolling culture present on such platforms – which often favours hate speech. (Dr. Chua & Dr. Banerjee, VoxPol, 29.04.2020)


Islamist terrorism

– Salafist groups’ use of social media and its implications for prevention: In a recent article also published on GNET, Manjana Sold explored the links between online and offline radicalisation processes, and especially how offline elements are reflected in the online world. Here, Hande Abay Gaspar continues the discussion of these interlinks, this time focusing on how radical Islamist groups exploit social media to attract new members and foster “group maintenance.” Abay Gaspar analyses these links by showing how the real-life activities of these groups are documented and promoted online to publicise the groups’ existence, and how consumption of and reaction to such material then furthers a “cycle of group attachment and group maintenance”. (Abay Gaspar, GNET, 28.04.2020)

– Germany bans Lebanese Hezbollah: In this article, Linda Schlegel provides background and analysis on the recent decision by the German Ministry of the Interior to ban Hezbollah entirely in the country (the organisation had previously been under a partial ban). According to Schlegel, Hezbollah has been present in Germany for “many years,” operating through multiple organisations and having built an important network of affiliates – a presence that could “no longer be tolerated” by the government due to Hezbollah’s terrorist financing activities. Schlegel underlines that the ban will give leverage to law enforcement agencies in the country ahead of anti-Israel protests on the last day of Ramadan. She concludes by arguing that the decision is largely a symbolic one, particularly given the EU’s lack of unity when it comes to Hezbollah: the EU designated the group’s armed branch as a terrorist organisation in 2013, but Member States have acted on this differently. (Schlegel, European Eyes on Radicalization, 06.05.2020)


Far-right violent extremism and terrorism

– CARR guide to online radical-right symbols, slogans and slurs: The Centre for Analysis of the Radical Right has released its latest report on far-right violent extremist online environments, focusing on key images and terminology used by violent extremists. While the first part of the report covers imagery used by far-right violent extremists, detailing changes in use over time, the second part focuses on language and numerical codes and is supported by a “glossary appendix”. An essential read for those studying far-right violent extremist online environments. (CARR, 04.05.2020)


Far-left violent extremism and terrorism

– The dark history of America’s first female terrorist group: In November 1983, a bomb exploded in the US Capitol’s north wing, causing $1 million in damage. No one was killed or injured, as a warning call was made five minutes before the attack. The attack was claimed by the Armed Resistance Unit, also known as the May 19th Communist Organisation, the US’ first “terrorist group entirely organized and led by women.” In this article, William Rosenau traces the history of May 19th and its armed struggle in the US in the 1970s and 80s. In particular, he dwells on its prominent figures and their shift from left-wing political activism to violence, as well as on the group’s links with other left-wing organisations. (Rosenau, Politico, 05.03.2020)


Counterterrorism

– Militants, fringe groups exploiting COVID-19, warns EU anti-terrorism chief: Reuters reports on a discussion with Gilles de Kerchove, EU Counter-Terrorism Coordinator, on the exploitation of the coronavirus crisis by terrorists and violent extremists. According to Reuters, de Kerchove is soon to circulate a paper to Member States outlining concerns about far-right violent extremist groups reaching out to confined individuals and encouraging them to infect “enemies.” Concerns also include the activities of Islamic State (and related groups) in Iraq, Syria, and the Sahel region. The article concludes with de Kerchove stressing the need for “heightened vigilance,” as talk of the risk of a terrorist group developing a biological weapon has a long history within counterterrorism circles. (Reuters, 30.04.2020)


Tech policy

– We are a new board overseeing Facebook. Here’s what we’ll decide: First announced by Facebook in November 2018, the company’s independent oversight body is set to become operational this year. The Oversight Board will have to respond to the difficult question of what content should be taken down on Facebook and Instagram. In this open letter, the co-chairs of the new board – Catalina Botero-Marino, Jamal Greene, Michael W. McConnell, and Helle Thorning-Schmidt – explain how this independent body will work to provide answers to the “most challenging content issues for Facebook.” Areas of focus will include “hate speech, harassment, and protecting people’s safety and privacy,” as well as Facebook’s commitment to abide by the board’s decisions. (Botero-Marino, Greene, McConnell, Thorning-Schmidt, The New York Times, 06.05.2020)

Further discussion on the Oversight Board includes:
The republic of Facebook (UN Special Rapporteur David Kaye, Just Security, 06.05.2020)
Facebook’s oversight board includes journalists, lawyers and activists (Bonifacic, Engadget, 06.05.2020)
Facebook’s ‘Oversight Board’: Move fast with stable infrastructure and humility (Douek, North Carolina Journal of Law and Technology, 2019)

– To apply machine learning responsibly, we use it in moderation: In this article, Matthew J. Salganik and Robin C. Lee go behind the scenes of the New York Times’ content moderation process. Salganik and Lee begin by detailing the history of the process, explaining how it evolved from human-only moderation to a machine-learning pre-screening of comments using Jigsaw’s Moderator system. They then explain how they tested the Moderator system and went through content moderator training to fully comprehend the system and understand its limitations, with a particular focus on identifying potential discriminatory bias in the machine-learning moderation process. Their investigation was driven by concerns about automated moderation, and these concerns are reflected in their recommendations, which stress the need for continual oversight of socio-technical systems, as well as the importance of human moderators overseeing automated moderation. (Salganik & Lee, The NYT Open Team, 30.04.2020)

– The EARN IT Act is a disaster amid the COVID-19 crisis: In this piece, Riana Pfefferkorn examines the draft EARN IT Act, a bill introduced in the Senate Judiciary Committee that would no longer guarantee online platforms’ immunity from “liability for the actions of their users.” Under the new act, online service providers would have to demonstrate that they comply with a set of “best practices” to tackle child sexual exploitation on their platforms. Stressing that tech platforms already report child sex abuse material (which is illegal under federal law), Pfefferkorn details the risks that the EARN IT Act poses for individuals’ privacy and data security, especially sensitive data. Pfefferkorn argues that the bill is in line with the US Department of Justice’s agenda against end-to-end encryption. In doing so she reminds us of the importance of end-to-end encryption for privacy and security, and of the risks that an encryption backdoor would pose, especially as we move more of our lives online during the COVID-19 crisis. (Pfefferkorn, Brookings, 04.05.2020)

– The sale of the dot-org registry to a private equity firm was just blocked. Here’s why it matters: In 2019, the Internet Society announced that it was planning to sell the dot-org domain to a for-profit private equity firm, handing over control of a domain used by millions, including a variety of civil society organisations, non-profits, and international organisations such as the United Nations. The sale was blocked last week by the Internet Corporation for Assigned Names and Numbers (ICANN) after campaigning by various civil society organisations. In this article, Brett Solomon, Executive Director of Access Now, dwells on the importance of this decision in ensuring that civil society organisations and non-profits remain “insulated from private sector interests” through the dot-org domain. This matters most at a time when the COVID-19 pandemic has forced civil society to fill the gaps left by “governments and corporations.” Solomon concludes by stressing that the decision is not the end of the dot-org domain’s troubles, as a solution remains to be found since the Internet Society “no longer wants to control” it. (Brett Solomon, Los Angeles Times, 01.05.2020)


For any questions, please get in touch via:
[email protected]


Background to Tech Against Terrorism

Tech Against Terrorism is an initiative launched by the United Nations Counter-Terrorism Committee Executive Directorate (UN CTED) in April 2017. We support the global technology sector in responding to terrorist use of the internet whilst respecting human rights, and we work to promote public-private partnerships to mitigate this threat. Our research shows that terrorist groups – both jihadist and far-right terrorists – consistently exploit smaller tech platforms when disseminating propaganda. At Tech Against Terrorism, our mission is to support smaller tech companies in tackling this threat whilst respecting human rights, and to provide companies with practical tools to facilitate this process. As a public-private partnership, the initiative has been supported by the Global Internet Forum to Counter Terrorism (GIFCT) and the governments of Spain, Switzerland, the Republic of Korea, and Canada.