Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.
Top Stories
- Ofcom has released a guide on the new requirements for video-sharing platforms (VSPs) in its content regulation approach. Tech Against Terrorism recently offered an initial response to Ofcom’s public consultation, which can be found here.
- The US government and 11 states have launched an antitrust lawsuit against Google over its search practices.
- The EU Internet Referral Unit (EU IRU) has released its annual transparency report, detailing its prevention activities and the investigative support it provided to member states on request in 2019.
- Multiple websites related to 8chan/8kun and QAnon were briefly taken down after an Internet service provider suspended its services to them. They are currently back online through a Russia-based Internet provider.
- YouTube has announced that it is clamping down on harmful conspiracy theories that are “used to justify real-world violence”, such as QAnon and the related Pizzagate theory.
- TikTok has provided an update on how the platform addresses hateful content and behaviour, identifying "hateful ideologies" that are no longer accepted on the platform and detailing its transparency reporting practices.
- Facebook has updated its enforcement against QAnon by redirecting QAnon-related searches to research published by the Global Network on Extremism and Technology (GNET).
- Vimeo has removed QAnon conspiracy theory videos due to the content violating Vimeo’s Terms of Service.
- Hope not Hate has published a comprehensive report on QAnon in the United Kingdom, detailing what the movement is, the threat it poses, and the role of social media.
Tech policy
- Digital Services Act should avoid rules on "harmful" content, big tech tells EU: In this article, Samuel Stolton discusses a position paper by EDiMA (the European industry lobby for online platforms and tech companies), which calls for the EU Digital Services Act to confine itself to illegal content rather than also covering “harmful content”. EDiMA argues that tech companies find it harder to balance freedom of speech against the obligation to protect users from potential harms when assessing “harmful” content than when assessing illegal content: what counts as illegal content rests on a clear legal basis and established mechanisms across the European Union, whereas what might be considered harmful content differs between individual member states. EDiMA therefore warns of the potential effects on freedom of expression if tech companies have to adjudicate what constitutes an online harm under pan-European legislation that neglects these inter-state differences. The DSA underwent public consultation between June and September this year, and the full package of measures is set to be revealed by the European Commission on 2 December. (Stolton, Euractiv, 15.10.2020).
- 'Who wants to be a millionaire' - but for terror attacks: game trains police to get Facebook data fast: This article by Thomas Brewster offers insight into a new game developed by Europol to train law enforcement personnel in obtaining information from tech companies in the event of a terrorist attack. The game starts with a hypothetical terrorist attack livestreamed on a particular tech platform, after which law enforcement personnel must request information from, and collaborate with, the platform in question. The game was developed by the SIRIUS project, established in 2017 as part of Europol’s European Counter Terrorism Centre to help investigators handle the amount of data available at crime scenes and on suspects. It is based on Europol’s existing guidance on retrieving information from tech companies and is meant to streamline requests to make them more effective and transparent. The article also notes the role of encrypted messaging apps: even where apps such as WhatsApp or Telegram are encrypted, investigators can still access the IP connection and the device a perpetrator used to connect to a platform. Brewster concludes that balancing customers’ privacy with protecting people’s safety is hard to navigate, but expresses hope that the game will help Europol work more efficiently with tech companies. (Brewster, Forbes, 14.10.2020).
To learn more about the DSA and individual member states’ online regulation, check out our Online Regulation Series, in which we shed light on regulatory frameworks across different regions of the world; this week we covered Europe.
Far-right violent extremism and terrorism
- Moskeeën in twee Franse steden onder politiebescherming ("Mosques in two French cities put under police protection"): Jelmer Kos reports that two mosques, one in Bordeaux and one in Béziers, have been placed under police protection from 21 October onwards, following calls for violence against them on social media. The two mosques have been targets of violence and have received hate speech on Facebook, including a call to burn down the mosque in Béziers. The increase in extremist hate speech follows the recent terrorist attack in France, as well as France’s promised crackdown on "Islamist separatism and radicalism". A recent example is the announcement by France’s Ministry of the Interior that it is shutting down a mosque that released a video on Facebook criticising the teaching methods of Samuel Paty. (Kos, NRC, 21.10.20).
Islamist terrorism
- French terror attack highlights social media policing gaps: This article by Elisa Braun and Laura Kayali examines how the recent terrorist attack in France brought the country’s shortcomings in social media regulation to light. The beheading of the teacher Samuel Paty came after he had been threatened on social media: a parent of one of the students at his school shared his anger with the teacher on WhatsApp and Facebook, which caught the attention of more radical groups and individuals, including the attacker. In response, the government has asked big tech companies operating in France, including Google, Facebook, and Twitter, to come up with “concrete proposals” to counter “cyber-Islamism”. However, Braun and Kayali argue that the problem with France’s content regulation is not a lack of new rules but the enforcement of existing ones. While France already has regulations for tech companies, one NGO points out that “moderation is a question of means”, noting that tech companies do not always have the capacity to comply with the rules. France’s most recent content regulation law was struck down over freedom of speech concerns. The article concludes that legislative efforts on content regulation fail to recognise that hate speech and other problematic content will continue to exist online; the authors argue that France should instead focus on slowing the spread of extremist content. (Braun and Kayali, Politico, 19.10.2020).
Last May, France attempted to introduce more stringent regulation of online content; however, the law was partly censured by the French Constitutional Council due to the risks it posed to freedom of expression. To read more on this, please find our recent addition to the Online Regulation Series here.
For any questions, please get in touch via: contact@techagainstterrorism.org