To read the full submission please see here.
Submitted: 15 September 2021
1. BACKGROUND
On 12 May 2021, the UK Department for Digital, Culture, Media & Sport published the draft Online Safety Bill (OSB), which aims to counter harmful content online and was first announced in a White Paper in April 2019. In July 2021, the House of Lords and the House of Commons established a Joint Committee to consider the draft OSB and report on its findings in December 2021. The Joint Committee issued a call for written evidence and accepted submissions on the draft OSB from July to September 2021.
2. TECH AGAINST TERRORISM’S CORE ARGUMENTS AND RECOMMENDATIONS
The arguments in our response to the consultation on the Online Safety Bill (“OSB”) may be summarised as follows:
1. The OSB lacks consideration for smaller platforms. Although the OSB differentiates between categories of service, it currently lacks precision on the threshold conditions for each category. To provide clarity for platforms affected by the requirements in the Bill, the OSB should include a taxonomy of platform size in its categorisation of services.
2. Imposing stringent legal requirements with no regard for platform size will harm the diversity and innovation that drive the tech sector. Stringent legal requirements will disproportionately impact smaller tech companies, which have fewer resources to support compliance, whereas larger tech platforms will be able to allocate the resources necessary to comply with the Bill.
3. The draft OSB undermines the rule of law and due process. It does so by outsourcing to tech companies the responsibility of adjudicating what constitutes illegal content (including terrorist content), without appropriate judicial oversight. Online counterterrorism efforts should be led by democratically accountable institutions, and the OSB should reflect this.
4. The OSB lacks proper and operable definitions of what is considered “harmful online content” or of what might constitute terrorist content. The same applies to “democratic” and “journalistic” content, whose definitions are currently not capable of operational effect and risk being weaponised by terrorists and violent extremists. This may render the Bill incapable of achieving its purpose of making the UK the safest place in the world to use the internet.
5. HMG should ensure that appropriate support mechanisms are created for platforms of all sizes and resource levels to ease compliance with the Bill. In particular, provisions in the Bill relating to risk assessments and the use of proportionate systems to counter illegal content will not be practicable for smaller platforms without support. HMG should continue supporting initiatives such as Tech Against Terrorism, which provide policy and practical support to tech platforms in countering terrorist use of the internet.