The Online Regulation Series | Germany

You can access the ORS Handbook here

Germany has an extensive framework for regulating online content, particularly with regards to hate speech and violent extremist and terrorist material. Experts also note that Germany’s regulatory framework has to some extent helped set the standard for the European, and possibly global, regulatory landscape.

Germany’s Regulatory Framework:

Main regulatory bodies:

  • The Voluntary Self-Regulation of Multimedia Service Providers (Freiwillige Selbstkontrolle Multimedia-Diensteanbieter, FSM) is a self-regulatory body recognised under the NetzDG. Its review panel consists of 50 lawyers, and tech companies can refer cases to the FSM when they are unsure whether content reported to them is illegal. Only social networks that are members of the FSM can do so.
  • As a general rule, the German government delegates enforcement to the tech companies within the law's remit, requiring them to carry out the obligations set by the legislation.

Main takeaways for tech platforms:

  • The NetzDG is one of the most extensive regulations of online content in the world. It requires tech companies to: 
    • Introduce an “effective and transparent complaint mechanism” for users to swiftly report criminally liable (under the German Criminal Code) content
    • Assess reported content’s illegality under German law and remove content quickly. Rules stipulate that once notified by users, a company shall remove “manifestly unlawful content” within 24 hours and other prohibited content within 7 days
    • Produce compulsory twice-yearly transparency reports detailing how they respond to user reports
    • Pay fines of up to 5 million euros (for the individual responsible for the complaints mechanism) or up to 50 million euros (for the company itself) when failing to comply with the regulation
  • The April 2020 Bill adds further requirements to the NetzDG by compelling companies to:
    • Improve the quality of their transparency reporting, requiring tech companies to:
      • Provide information on counter-notification procedures
      • Detail the results of their use of automated methods for detecting illegal content
      • Clarify whether they have given access to their data to independent researchers
    • Facilitate reporting processes of illegal content
    • Strengthen appeal processes to allow users to challenge content removal decisions through a case-by-case review process
    • The April 2020 Bill also includes the February 2020 amendment, Gesetzentwurf zur Bekämpfung des Rechtsextremismus und der Hasskriminalität, which is currently on hold. This amendment would require companies to:
      • Provide the Federal German Police Force with private information of users posting illegal content
      • Refrain from alerting users to the action taken for 14 days

October 2017: The NetzDG

The NetzDG was introduced in 2017 to combat hate speech and target terrorist and extremist content, misinformation, and online speech that “may lead to hate crimes”. The NetzDG is aimed at large social media companies with over 2 million users.

When unveiled, the NetzDG was criticised by several civil society organisations, including Article 19 and Human Rights Watch, as well as by David Kaye, the United Nations Special Rapporteur on Freedom of Expression. Kaye criticised the fact that tech companies are now legally responsible for adjudicating the illegality of content, with little to no oversight from courts and public prosecutors. Since the German Criminal Code and tech platforms’ Terms of Service differ, he argued, companies must follow two sets of guidance on content moderation, without court orders or judicial review to assist them in determining content illegality.

Article 19, for its part, cautioned that the 24-hour removal deadline and the high fines platforms face might make companies err on the side of removal. This could lead to the censoring of content that is neither extremist nor illegal in nature – what some have called “over-policing” of content – and raises serious questions about the potential negative impact on freedom of expression. Daphne Keller has pointed out that, whilst some argue the NetzDG has not led to an increase in content removal, there is no way to assess this adequately without knowing how far companies have increased removals under their Terms of Service as a precautionary measure prompted by the NetzDG.

February 2020: Gesetzentwurf zur Bekämpfung des Rechtsextremismus und der Hasskriminalität

Germany’s parliament passed an amendment to the NetzDG in February 2020. The amendment aims to further regulate hate speech, cyberbullying, and content stemming from violent far-right extremism by obliging tech companies to share the information of users who post illegal content with the German Federal Police Force.

The amendment follows three right-wing terrorist attacks in Germany: the 2019 Halle attack, the 2020 Hanau attack, and the 2019 murder of pro-immigration politician Walter Luebcke. The German government highlighted Luebcke’s murder when justifying the introduction of the amendment, stressing that his death was preceded by his being targeted with online hate speech.

The amendment underwent a review on 7 October 2020 and has been put on hold. German President Frank-Walter Steinmeier has held off from ratifying it due to its potential unconstitutionality, mainly because of possible privacy violations. This mirrors concerns from civil society and legal experts, which centre on the serious privacy implications of social media platforms providing the government with users’ private information without any judicial oversight. At the time of writing, it is unclear what will happen to the amendment, but content regulation expert Matthias Kettemann has hypothesised that it might be brought before parliament again or repealed altogether.

April 2020: further amendments to the NetzDG

The April 2020 draft bill, a collation of further amendments to the NetzDG, widens the law’s scope from social media platforms to video-sharing platforms (VSPs), extends the requirements placed on tech companies (see above), and incorporates the above-mentioned February 2020 amendment.

EuroISPA, a pan-European association of national Internet Services Providers Associations (ISPAs), has raised concern that the German amendments were drafted before the EU Digital Services Act (DSA) was finalised, as the DSA was still undergoing public consultation at the time. EuroISPA cautioned that the NetzDG amendments, and their implications for online regulation in Germany, will limit legislative consistency across member states: tech companies and VSPs will need to comply with each member state’s individual online regulation rules, which raises barriers for new entrants to the market. This is of particular concern because the April 2020 amendments extend the scope of the NetzDG to VSPs of all sizes, as they are meant to incorporate Germany’s obligations under the EU’s revised AVMSD (2018) into the NetzDG scheme. Whilst bigger companies might have the resources to comply with the NetzDG, smaller companies might struggle due to lack of capacity. Given that terrorists predominantly exploit smaller tech platforms for this very reason, this presents significant risks to competition and innovation.

Concerns over negative global impact

The NetzDG has been used as a template for regulatory frameworks in other countries, despite the significant critiques of the law. Several civil society groups have warned that the law may inspire similar or more restrictive regulation by less democratic nation states, which could further infringe on freedom of speech and digital rights globally.


[1] Because the AVMSD is a European Union Directive, it is up to individual member states to draft legislation that meets the obligations set out in the Directive. Germany’s adoption is covered in the April 2020 legislation.

[2] In 2018, the EU updated its Audio-Visual Media Services Directive (AVMSD), which governs Union-wide coordination of national legislation on audio-visual services (such as television broadcasts), to include online video-sharing platforms (VSPs). It encourages Member States to ensure that VSPs under their jurisdiction comply with the requirements set out in the AVMSD, including preventing the dissemination of terrorist content.

Resources:

Article 19 (2017), Germany: Act to Improve Enforcement of the Law in Social Networks

de Streel Alexandre et al. (2020), Online Platforms’ Moderation of Illegal Content Online, Policy Department for Economic, Scientific and Quality of Life Policies – Directorate-General for Internal Policies.

Earp Madeline (2020), Germany Revisits Influential Internet Law as Amendment Raises Privacy Implications, Committee to Protect Journalists.

Echikson William (2020), The Impact of the German NetzDG Law, CEPS Europe.

Kaye David (2019), Speech Police: The Global Struggle to Govern the Internet, Columbia Global Reports.

Hardinghaus Alexander, Kimmich Romona & Schonhofen Sven (2020), German Government Introduces New Bill to Amend Germany’s Hate Speech Act, Establishing New Requirements for Social Networks and Video-Sharing Platforms, Technology Law Dispatch, ReedSmith.

Heldt Amelie (2020), Germany is Amending its Online Speech Act NetzDG… but Not Only That, Internet Policy Review.

Human Rights Watch (2018), Germany: Flawed Social Media Law

Lee Diana (2017), Germany’s NetzDG and the Threat to Online Free Speech, Yale Law School, Media Freedom and Information Access Clinic.

Lomas Natasha (2020), Germany Tightens Online Hate Speech Rules to Make Platforms Send Reports Straight to the Feds, TechCrunch.

Pielemeier Jason (2019), NetzDG: A Key Test for the Regulation of Tech Companies, GNI.

Tworek Heidi and Leersen Paddy (2019), An Analysis of Germany’s NetzDG Law, Transatlantic Working Group.