European Commission Publishes Proposed Regulation Governing Online “Terrorist Content”

The European Commission recently published a draft "Regulation on preventing the dissemination of terrorist content online." If enacted, the draft Regulation would impose stringent monitoring, removal and reporting requirements on "hosting service providers," a term defined to include virtually all providers that host user-generated content.

1. The Draft Regulation: An Overview

The draft Regulation would impose several new legal duties on service providers, all intended to deter the dissemination of online "terrorist content" and require providers to take swift action when such content appears on their services. The draft Regulation would also provide for harsh penalties for noncompliance.

The key elements of the draft Regulation are:

  • Broad jurisdictional reach. The draft Regulation applies to all providers "offering services in" the European Union, no matter where in the world those providers are located. That includes providers with a "significant number of users in one or more Member States," even if those providers do not target their activities toward those users. Art. 2(3)(b).
  • Broad definition of "terrorist content." As explained below, the draft Regulation imposes several new legal duties regarding "terrorist content," which is defined to mean:
    • "inciting or advocating, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed";
    • "encouraging the contribution to terrorist offences";
    • "promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group"; or
    • "instructing on methods or techniques for the purpose of committing terrorist offences."

Art. 2(5). Notably, the draft Regulation's definition of "terrorist content" is both novel and broader than similar definitions, including the United Nations' definition of "incitement to terrorism." (Practice 8 at 16).

  • New legal duty to prevent dissemination of terrorist content. The draft Regulation imposes on providers a new duty of care to prevent the dissemination of terrorist content. Specifically, providers must "take appropriate, reasonable and proportionate actions . . . against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, and with due regard to the fundamental rights of the users and take into account the fundamental importance of the freedom of expression and information in an open and democratic society." Art. 3(1). Relatedly, providers must include (and enforce) provisions in their terms of service that bar terrorist content. Art. 3(2).
  • New legal duty to report terrorist offenses. The draft Regulation requires providers to "promptly inform" authorities when they "become aware of any evidence of terrorist offences." Art. 13.
  • Broad enforcement powers for designated authorities, including one-hour removal orders. Each Member State must designate a competent authority to enforce the Regulation. Competent authorities may:
    • Issue "removal orders" under which providers must remove or disable access to terrorist content within one hour of receiving such orders (Art. 4);
    • Issue "referrals" under which providers must "assess the content identified in the referral against [their] own terms and conditions and decide whether to remove that content or to disable access to it" (Art. 5); and
    • Order providers to implement "specific additional necessary and proportionate proactive measures" to identify and remove terrorist content (Art. 6).
  • Preservation requirements. In most cases, providers must preserve removed content for six months. Art. 7.
  • Transparency requirements. Providers must "set out in their terms and conditions their polic[ies] to prevent the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures including the use of automated tools." Art. 8(1). Providers must also publish annual "transparency reports" that explain "action[s] taken against the dissemination of terrorist content." Art. 8(2)-(3).
  • Safeguards for use of automated tools. Providers that use automated tools for detecting and removing terrorist content must "provide effective and appropriate safeguards," including human review, to ensure that removal decisions are "accurate and well-founded." Art. 9.
  • Complaint mechanisms. Providers must "establish effective and accessible mechanisms" allowing users whose content is removed to appeal those removals. Art. 10. (Users cannot ask providers to reverse removals made in response to removal orders.) Providers must also explain to users why their content has been removed. Art. 11.
  • Potentially harsh penalties. Member States and their competent authorities have discretion to determine penalties for noncompliance. However, providers who "systematically" fail to comply with the removal order process must be "subject to financial penalties of up to 4% of . . . global turnover of the last business year." Art. 18.

2. History and Background

The draft Regulation is the culmination of years of efforts by policymakers to restrict online terrorist content. The United Nations Security Council's 2005 Resolution 1624 called on Member States to adopt measures necessary to prohibit and prevent incitement to commit a terrorist act (i.e., "terrorist incitement") and to deny safe haven to persons suspected of committing terrorist incitement. The Resolution encouraged Member States to implement these measures in a manner that "prevent[s] the indiscriminate targeting of different religions and cultures" and complies with "obligations under international law, in particular human rights law, refugee law, and humanitarian law."

Since 2005, European agencies and lawmakers have taken various steps to implement Resolution 1624, including engaging in public consultation, developing a code of conduct, and creating an internet referral unit for countering illegal content online. The draft Regulation is intended to harmonize these efforts and standards.

3. What's Next?

The draft Regulation will likely be voted on before the EU's parliamentary elections in May 2019. We can, therefore, expect significant debate and activity around the draft Regulation in the coming months.

Perkins Coie attorneys are following this legislation closely and analyzing whether and to what extent it potentially conflicts with U.S. law, including the First Amendment to the U.S. Constitution, the Communications Decency Act, and the Electronic Communications Privacy Act.

Clients with questions about the draft Regulation, including how it may affect content moderation and enforcement efforts, should contact experienced counsel.

© 2018 Perkins Coie LLP
