Tech Against Terrorism response to the EU’s Digital Services Act
Adam Southey, 14 January 2021
Summary
On 15 December 2020 the EU unveiled the draft Digital Services Act (DSA), an ambitious proposal aiming to reshape the digital space and introduce new obligations for online platforms operating in the EU. There are several positive aspects of the DSA that could lead to a healthier online environment. However, we remain sceptical that it will – particularly in combination with the EU’s regulation on terrorist content online – prevent terrorist use of the internet. In our submission to the DSA consultation process, Tech Against Terrorism called on the EU to ensure that government accountability and the rule of law are key parts of the DSA, and recommended against introducing measures that punish smaller platforms. Unfortunately, the first draft of the DSA seems to miss the mark on these points.
Recommendations
We recommend that the EU:
- Encourages improved strategic leadership from governments in online counterterrorism efforts, for example via increased, accurate, and responsible designation of far-right terrorist groups to support smaller tech companies in actioning such content
- Modifies Article 14 of the DSA so as not to risk undermining the rule of law and imposing legal liability on tech platforms
- Ensures that smaller tech companies are not punished for lack of capacity to comply with the regulation(s), and supports them in improving their response mechanisms
- Clarifies what safeguards are put in place to ensure Digital Services Coordinators’ independence in order to avoid fragmentation in practical enforcement of the DSA
- Clarifies what safeguards are in place to avoid extra-territorial application of national law in the EU
- Complements the transparency requirements for tech companies with similar requirements for governments
- Presents the evidence upon which demands for “expeditious” (or one-hour) content removal deadlines are based
Positive aspects
The DSA contains several positive aspects. Below are a few worth highlighting:
- The Commission seems to (at least partially) shift the focus towards a systemic approach that encourages accountability and transparency, acknowledging that content moderation decisions are complex and difficult to apply consistently
- Liability protections remain largely in place with what can be described as a model similar to the US “Good Samaritan” principle
- Smaller tech platforms’ capacity (or lack thereof) has been taken into account for several of the obligations placed on tech companies
- There is a focus on redress and transparency in line with the Tech Against Terrorism Pledge
Furthermore, we appreciate the EU Commission’s work via initiatives such as the EU Internet Forum, which we (despite relevant criticism) believe has improved online counterterrorism efforts in the EU.
Areas of concern
We are sceptical that the DSA – despite its positive aspects – will contribute to preventing terrorist use of the internet. In particular, we have concerns relating to the rule of law, the DSA’s impact on smaller platforms, the risk of regulatory fragmentation, and where ownership of online counterterrorism efforts in the Union should lie.
Rule of law
As highlighted in our Online Regulation Series, several governments have introduced or proposed regulation to tackle “harmful” content that risks undermining the rule of law by outsourcing adjudication of illegality to tech companies. Whilst the DSA is more balanced than many other global regulatory efforts, it leaves several questions unanswered with regard to the rule of law and whether democratically accountable institutions or private tech companies should shape online counterterrorism efforts in the EU.
Specifically, we have concerns around the notice-and-action mechanism outlined in Article 14. Under this scheme, platforms will determine the legality of content reported to them by “any individual or entity”. This effectively outsources adjudication of legality to tech platforms. Courts and judges should make such decisions – not tech companies – and whilst this scheme might prove expedient, it risks undermining the rule of law in the EU. It is disappointing that, despite warnings against such mechanisms from several UN Special Rapporteurs, civil society groups, and Tech Against Terrorism – in relation to the DSA, the regulation on online terrorist content, and Germany’s NetzDG law – the EU has decided to proceed with this framework. Whilst the EU has publicly spoken about its willingness to limit the power of big tech, this provision might give big tech more power by making them the de facto arbiters of what is legal and illegal online.
Smaller platforms
Whilst the exemptions offered to smaller platforms are welcome, we are concerned that the user notice mechanism in Article 14 risks removing the liability protections detailed in Articles 3-5. There is a risk that companies (big and small) will opt to remove all flagged content to avoid penalties, which in turn raises questions around freedom of expression. Furthermore, smaller platforms will be held to poorly defined deadlines, with the DSA requiring companies to remove illegal content “expeditiously”. This might lead to smaller platforms being penalised for not having the capacity to process and action notices. Lastly, we fear that imposing financial penalties on smaller and micro-platforms risks either stifling innovation and competition or simply being ineffective, as several of the micro-platforms where terrorist content is predominantly hosted barely have any revenue.
It is fair to expect tech companies of all sizes to contribute to online counterterrorism efforts. However, by penalising companies for not having sufficient capacity to tackle terrorist use of the internet without providing them with adequate support, we risk setting smaller companies up for failure and thereby hindering the competitive environment the EU says it seeks to create.
Risk of fragmentation and extra-territoriality
The DSA specifies (Article 38) that each Member State shall designate its own Digital Services Coordinator (DSC) to assess tech company compliance with the regulation. However, in our view the draft does not sufficiently specify which types of national bodies should be eligible. Although the draft says that such bodies shall act with “complete independence” and will be supervised by an overarching board, we encourage the EU to clarify which safeguards will be put in place to avoid regulatory fragmentation as a result of potential inconsistency between DSCs.
Furthermore, we encourage the EU to clarify how the regulation will function in relation to other regulations and voluntary frameworks. For example, the United Kingdom’s Online Harms scheme, the OECD, and the Christchurch Call to Action are all either compelling or encouraging platforms to be more transparent. Transparency is unequivocally good and is a key part of our engagement programme with companies. However, international bodies and governments will need to coordinate to ensure that such mandates actually lead to meaningful transparency and do not merely serve as a means of outbidding one another in making demands of tech platforms.
As seen in our Online Regulation Series, several countries have introduced or considered introducing legislation that effectively allows for extra-territorial application of national law. Whilst the DSA does not explicitly encourage such mechanisms, Article 8 does allow for extra-territorial orders. We encourage the EU to clarify what mechanisms will be put in place to avoid abuse of this provision.
Government accountability and strategic leadership
In addition to concerns about specific provisions, there is a broader point to be made about strategic leadership in tackling terrorist use of the internet. For years, governments have generally been content to let larger tech platforms and industry bodies govern online speech and online terrorist content. Often, companies have done this with little strategic leadership from governments. Whilst governments have proposed removal deadlines, financial penalties, and voluntary arrangements for public-private cooperation, the burden of defining terrorist content and drawing the appropriate distinctions has fallen on larger tech companies. The result is a process in which democratically unaccountable tech companies set online speech standards, something EU leaders have criticised in the past week despite themselves having proposed legislation that risks entrenching this dynamic. Unfortunately, the DSA does not seem to significantly alter this arrangement, and it diverges from what we believe are the priorities in tackling terrorist use of the internet.
In our view, it is vital that counterterrorism, whether offline or online, is led by democratically accountable institutions in accordance with the rule of law. This is why we call on governments and institutions like the EU to accurately and responsibly designate terrorist groups, since this helps tech companies, and particularly smaller platforms, identify terrorist content on their sites. Designation is also an effective way to improve action from occasionally hostile so-called alt-tech platforms, which would otherwise shelter such content under the rubric of free speech. We believe that the investment in regulations such as the terrorist content regulation and the DSA could be directed towards clarifying rule of law-compliant approaches to countering terrorist-operated websites, which are increasingly important propaganda tools for terrorist groups and where government leadership on countermeasures is currently missing. As recent events have shown, there is uncertainty as to what principles web infrastructure providers should apply to platforms or websites hosting violent (including terrorist) content, and it is our assessment that government leadership in this regard has been lacking.
The DSA and the regulation on terrorist content online could have been an opportunity for the EU to help provide strategic leadership and to introduce mechanisms that will support smaller tech platforms in tackling terrorist use of the internet. However, it is our assessment that this is not currently reflected in the most recent texts of either regulation.