You can access the ORS Handbook here.
Canada’s approach to online regulation has, so far, been characterised by its support for tech sector self-regulation rather than government-led regulation of online content. However, concerns over foreign interference in Canadian politics, online hate speech, and extremism have led to public discussion of introducing legislation on harmful online content and of making tech companies liable for content shared on their platforms.
Canada’s regulatory framework:
- National Strategy on Countering Radicalization to Violence, 2018, which summarises Canada’s approach to countering terrorism and violent extremism.
- Canada’s Communications Future: Time to Act (BTLR), January 2020, a broad review of Canada’s broadcasting and telecommunications legislation, setting out recommendations for the future of the country’s legislative framework and calling for the introduction of social media regulation.
- Canada’s Digital Charter, 2019, which lays out Canada’s approach to internet technologies and the online space, with the ninth principle addressing the issue of violent extremism and underlining that the online space should be “free from hate and violent extremism”.
- Digital Citizen Initiative, Canada’s strategy for building “resilience against online disinformation and […] support a healthy information system”, focused on research and “citizen” activities.
- Canada is a signatory to the Christchurch Call to Action.
Main regulatory bodies:
- Canadian Radio-television and Telecommunications Commission, which oversees the regulation of internet services in the country.
- Public Safety Canada (Ministry of Public Safety and Emergency Preparedness) – the main federal body in charge of coordinating matters related to national security, safety and maintaining a peaceful society.
- Canada Centre for Community Engagement and Prevention of Violence, responsible for the National Strategy on Countering Radicalization to Violence.
- Innovation, Science and Economic Development Canada, which oversees different areas of Canada’s economic development and published the 2020 broadcasting and telecommunications legislative review.
- Canadian Heritage, which oversees the Digital Citizen Initiative.
- In December 2019, newly appointed Minister of Canadian Heritage Steven Guilbeault was tasked by Prime Minister Justin Trudeau in a Mandate Letter to develop a new regulatory framework for social media, “starting with a requirement that all platforms remove illegal content, including hate speech, within 24 hours or face significant penalties. This should include other online harms such as radicalization, incitement to violence, exploitation of children, or creation or distribution of terrorist propaganda.”
Key takeaways for tech platforms:
- Tech platforms are exempt from liability for user-generated content.
- Canada has favoured a self-regulatory approach to the moderation of online content and speech, engaging in cross-sector initiatives to support the tech sector in countering terrorist and violent extremist use of the internet.
- Canada’s Communications Future: Time to Act (2020), known as the BTLR, offers a blueprint for regulating online content in the country, calling for tech companies to be held liable for harmful content on their platforms.[1]
Support for self-regulation and cross-sector initiatives
Both the Digital Charter and the National Strategy on Countering Radicalization to Violence stress that citizens should be able to “fully participate in the online spaces” without viewing harmful and extremist content. Further, Canada’s framework for countering terrorism and extremism has comprehensively integrated the need to tackle terrorist use of the internet. The 2018 National Strategy lays out the principles of Canada’s approach to extremist content online, which is based on a “multi-stakeholder approach that includes national and international engagement with technology companies, academic researchers and civil society.”
This has led Canada to focus on digital literacy and counter-narrative efforts, and on supporting research to better understand the terrorist and violent extremist online landscape in the country. Most of these initiatives are funded via the Community Resilience Fund.
Supporting innovative tools for a swifter identification of terrorist content
Following Canada’s signing of the Christchurch Call to Action, Public Safety Canada announced that it would award a grant to Tech Against Terrorism to develop the Terrorist Content Analytics Platform (TCAP). The TCAP will be the world’s largest database of verified terrorist content, aimed at supporting tech companies in swiftly identifying terrorist content uploaded on their platforms, and will inform quantitative research on terrorist use of the internet.
Canada’s support for the TCAP demonstrates the country’s acknowledgment of the difficulties faced by small and micro tech companies in tackling terrorist exploitation, and of its willingness to support content moderation via innovative tools.
BTLR: towards regulation of online content and speech?
Concerns about online foreign interference in Canadian politics and elections have led to calls for regulating online content, especially on social media. Former Minister of Democratic Institutions Karina Gould called for such regulation, arguing that tech companies had demonstrated a “lack of willingness” to address the issue.
In addition, concerns about the future of the Canadian digital space, including extremist and harmful content, have led Canada to consider regulatory approaches. In June 2018, the government commissioned a review of the communications legislative framework, which resulted in the Canada’s Communications Future: Time to Act report. The report highlights concerns related to the spread of harmful content and extremist views online. Notably, it recommends the introduction of “legislation with respect to liability of digital providers for harmful content and conduct using digital technologies.” Such legislation would aim to counter the spread and amplification of “harmful content” (a term which remains undefined in the report) online.
With regard to illegal content – including terrorist content – the BTLR recommends that the Canadian government introduce regular reviews of tech platforms’ mechanisms for monitoring and removing “illegal content and conduct found online”.
Further, the BTLR recommends the establishment of a registration system for tech companies operating in Canada, which would bring all media content providers under a newly formed “Canadian Communications Commission”. Registered companies would have to “provide such information as the CRTC [Communications Commission] may specify” and would be obliged to support the diffusion of Canadian content. This registration would differentiate between different types of media providers: content curation, such as Netflix or Spotify; content sharing, such as Facebook and YouTube; and content aggregation, for media disseminating content from curators, which would mostly apply to traditional broadcast media services.
Whilst many of the proposals to regulate tech platforms and online content are still at an early stage, online regulation in Canada is likely to undergo major changes, with the principle of platform liability for user-generated content becoming embedded in the Canadian online landscape.
[1] At the time of writing, it remains uncertain whether the recommendations made in the BTLR will become law in Canada.
Resources:
Austen Ian (2019), “Canada Joins the World in a Social Media Crackdown”, The New York Times
Baker McKenzie (2018), “Government of Canada Looks to Modernize Telecommunications and Broadcasting Legislation for the Digital Age”, Lexology
Boutilier Alex, Oved Marco C., Silverman Craig, and L. Jane (2019, updated in 2020), “Canadian government says it’s considering regulating Facebook and other social media giants”, The Hamilton Spectator
Jeftovic Marc E. (2020), “Canada’s BTLR is a blueprint for regulating internet content”, easyDNS.com
Canada Centre for Community Engagement and Prevention of Violence (2018), “National Strategy on Countering Radicalization to Violence”
Government of Canada (2019), “Canada Declaration on Electoral Integrity Online”
The Guardian (2019), “Canada may regulate social media companies to avoid election meddling”
Innovation, Science and Economic Development Canada (2019), “Canada’s Digital Charter in Action: A Plan by Canadians, for Canadians”
Innovation, Science and Economic Development Canada (2019), “Canada’s Communications Future: Time to Act”, Broadcasting and Telecommunications Legislative Review
Library of Congress, “Government Responses to Disinformation on Social Media Platforms: Canada”
OpenMedia (2020), “The BTL…What? What is the BTLR report and what it means for the future of our Internet”
Public Safety Canada (2019a), “Government of Canada Announces Initiatives to Address Violent Extremist and Terrorist Content Online”
Public Safety Canada (2019b), “Government of Canada Announces Initiatives to Address Violent Extremist and Terrorist Content Online”