On Wednesday, the UK Government published the draft Online Safety Bill. While there are some ideas in the Bill that are to be encouraged, we are concerned that key concepts underpinning the Bill remain unclear and uncertain.
Our primary concern is that this uncertainty leaves the future of smaller platforms in the UK equally unclear. While larger platforms will most likely be able to manage the new regulations whatever their scope or burden, smaller platforms will find it nigh on impossible to meet many of the proposed obligations and duties.
We encourage the UK Government to provide clarity on the scope of the Bill. We welcome, for example, the principle of proportionate regulation throughout the Bill; however, we are concerned that, despite employing categories for platform size – i.e. Category 1 for larger platforms, Category 2B for smaller platforms – the Bill does not stipulate the criteria for these categories. Instead, Schedule 4 states that categorisation will fall within the purview of the Secretary of State. Given this ambiguity, it is possible that smaller UK platforms will be designated Category 1 services and required to comply with the highest regulatory standard, a standard they will not be able to uphold.
We also fear that definitions throughout the Bill are impractically broad. Section 41(5)(a) defines terrorism content as content that leads to a terrorist offence. This definition is circular: it does little to tell a tech company what content falls within it, nor how companies should operationalise it when acting against terrorist exploitation of their services. It leaves tech companies to adjudicate on what constitutes terrorist content, when such responsibility should lie with governments.
The diversity of the internet is essential to preserving a free and open online world. If smaller platforms struggle to comply with the new duties and requirements of the Online Safety Bill, we may see many relocate to other jurisdictions, seriously compromising the diversity of the UK online space. We recommend that the UK Government reconsider some of its key definitions – including "harmful content" and "terrorist content" – and clarify the categories of service in conjunction with industry and civil society, in order to avoid the confusion that vague or arbitrary definitions may cause.
The UK Government should also focus on the support our smaller platforms really need. Smaller platforms are increasingly subject to criminal exploitation; however, they struggle to moderate harmful content not because of a lack of enthusiasm, but because of a lack of resources and specialist expertise. We know from experience that smaller platforms are very receptive to mentoring and to any opportunity to learn more about how to minimise the terrorist and violent extremist threat online. If the UK Government wishes to tackle online harms – including terrorist content – effectively, we recommend it invest in similar programmes to support smaller platforms.
For more information on how we can support smaller platforms, see:
- The Terrorist Content Analytics Platform
- The Tech Against Terrorism Mentorship Programme
- The Tech Against Terrorism Pledge