Wikimedia Foundation Legal department/FAQ On Countering Terrorist and Violent Extremist Content on Wikimedia Projects

This page answers questions about Wikimedia Foundation procedures and guidelines concerning terrorist and violent extremist content.

  1. What is the purpose behind this update?
    Over the last two decades, the use of the Internet and online platforms to facilitate violent extremist activities has become a cause of concern for governments as well as online platforms. Online platforms, of their own accord, have taken several voluntary measures, such as establishing self-regulatory codes of conduct and forming multi-stakeholder groups, to address the unfettered distribution of Terrorist and Violent Extremist Content (TVEC/Terror Content).
    Similarly, several countries have noted this concern and are exploring legislative solutions to address the online dissemination of terrorist and violent extremist content. In May 2021, the European Union promulgated Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online (TCO, also known as TERREG), which aims to prevent the misuse of hosting service providers for the spread of TVEC. TCO comes into effect on June 7, 2022.
  2. What impact will it have on the Wikimedia Foundation’s Community?
    As a hosting service provider, the Wikimedia Foundation (Foundation/Wikimedia) is responsible for ensuring that its projects are not misused to promote Terror Content. As a service provider with a global audience, the Foundation is, in certain circumstances, required to comply with legal requests to address Terror Content. While existing policies, including the Terms of Use and the Neutral Point of View policy, are well equipped to address concerns regarding Terror Content on the projects, the Foundation has established this protocol as a minor update to address the rare cases that legally require immediate action. The Foundation remains committed to safeguarding community rights, including the right to freedom of speech and expression and the right of access to knowledge, while also ensuring that the projects are not misused for disseminating Terror Content.
  3. Are there any specific regulations that address Terror Content on online platforms? Further, do they prescribe any specific obligations for hosting service providers?
    As mentioned above, the European Union’s TCO comes into effect on June 7, 2022. Amongst other obligations, all hosting service providers, upon receiving a “removal order” from a designated authority of an EU Member State, are required to remove the “terrorist content” and/or disable access to it across all Member States as soon as possible and, in any event, within one hour of receipt of the order. Further, TCO requires hosting service providers to inform the designated authorities once they become aware of any terrorist content involving an imminent threat to life.
  4. Will the Foundation now moderate Terror Content on Wikimedia Projects?
    At present, the Foundation’s role is limited to reviewing and complying with valid legal removal orders from designated government authorities. That said, if the Foundation is of the view that a removal order has been issued in error, it will take the necessary actions to challenge the order before the relevant authority.
  5. What constitutes Terrorist and Violent Extremist Content (TVEC)/Terror Content?
    While there is no universally accepted definition of “Terrorist Content,” different stakeholders, including governments, civil society, and organizations in the online ecosystem, each offer their own definition. For example, TCO defines “Terrorist Content” to include content that:
    • incites others to commit terrorist offenses (by glorifying terrorist acts, advocating the commission of such offenses, etc.);
    • solicits others to commit or to contribute to the commission of terrorist offenses;
    • solicits others to participate in the activities of a terrorist group (see next question);
    • provides instruction on the making or use of explosives, firearms, or other weapons or noxious or hazardous substances, or on other specific methods or techniques for the purpose of committing or contributing to the commission of terrorist offenses; and/or
    • constitutes a threat to commit one of the terrorist offenses.
    TCO makes exceptions for content disseminated to the public for educational, journalistic, artistic, or research purposes. Additionally, the regulation specifically allows the expression of polemic or controversial views in the course of public debate. Most of the contributions available on the Foundation’s projects would likely fall within one of these exceptions.
    However, the Foundation will consider these different stakeholders’ definitions and interpretations as guiding principles during its review of removal orders, each of which will be considered on a case-by-case basis.
  6. Does the Foundation plan to maintain a list of Terrorist Individuals and/or Terrorist Organizations for the purposes of this update?
    No. When reviewing a TCO or TVEC-related removal order, the Foundation may consider, as a relevant factor in its analysis, whether the individuals or groups involved have been designated as Foreign Terrorist Organizations or Specially Designated Global Terrorists (SDGTs) by the United States Department of State, or recognized as terrorist organizations by the United Nations Office of Counter-Terrorism and/or the European Council.
  7. Does that mean that all content about Terrorists, Terrorism, and Violent Extremist Acts runs the risk of removal?
    No. Content will not warrant legal review or removal merely because it pertains to the abovementioned topics. Terror Content published for the purposes of research, journalism, public discussion of counter-terrorism, and/or raising awareness against terrorism will not run the risk of removal. If, during review, the legal team finds that the content in question is relevant for any of the above-mentioned reasons, it will consider challenging the order(s) to allow for the continued hosting of such content.
  8. Will I be informed if my contribution to Wikimedia Projects is categorized as Terror Content and taken down in response to a legal order?
    Unless the legal order requires otherwise, when acting on a removal order from the designated authority of an EU Member State, we will reach out to inform the contributor that the content has been removed based on that removal order.
  9. What should we do if we encounter Terror Content, or problematic editors, on our projects?
    Currently, the applicable law does not require community members or the Foundation to actively search for Terror Content online. However, if you come across content that you perceive as problematic and that, in your reasonable opinion, either involves an imminent threat to life or a potential terrorist offense, or satisfies one or more of the categories for Terror Content outlined above, you may consider taking the following measures:
    • For content that involves an imminent threat to life or a potential terrorist offense, use standard practice and tools to block the content (revert, file deletion, suppression, etc.) and report the activity to the Trust and Safety team (emergency(_AT_)wikimedia.org) so that they may inform the applicable law enforcement authorities. When removing content, follow the standard community process, i.e., detail the reason for removing the content and inform the author of its removal.
    • For content that satisfies one or more of the Terror Content categories above, please report that content to the Trust and Safety team (emergency(_AT_)wikimedia.org) for further evaluation. If the content should be removed, use standard practice and tools (revert, file deletion, suppression, etc.) and follow standard community processes by detailing the reason for removal and informing the content author.
    If you have questions about the method by which content should be removed, feel free to reach out to the Trust and Safety team (ca(_AT_)wikimedia.org).
  10. What if a removal order infringes on volunteers’ freedom of expression and information?
    The Foundation is committed to protecting user content while, at the same time, ensuring that the projects are not misused for disseminating Terror Content. To this effect, all removal orders will undergo review by the Foundation’s in-house counsel, who will initiate appropriate measures whenever the content requested for removal does not clearly constitute Terror Content under the applicable law. Under this process, the Foundation will ensure that content published for educational, informative, artistic, or research purposes is not taken down, including content that some may consider radical or controversial regarding a sensitive/critical political issue.
  11. Will the Foundation preserve copies of content that has been removed in response to a legal order?
    The Foundation may preserve removed Terror Content for a limited period, or for a period prescribed under the applicable law, for specific purposes such as administrative or judicial review proceedings or assisting legal authorities in investigating terrorist offenses. Preservation of removed content also ensures that it can be reinstated in the event that complaint proceedings indicate it was erroneously removed or disabled.
  12. What steps is the Foundation taking to ensure transparency regarding TVEC takedowns?
    Our annual transparency report will also provide information regarding our efforts to address Terror Content on projects in accordance with the obligations prescribed under the applicable laws, including TCO.
    The Foundation continues to explore methods of tracking removed Terror Content that best maintain our value of transparency surrounding content removal. Unlike other content that we make available after removal, such as material related to DMCA compliance (transparently logged on Commons and on the Foundation website), Terror Content may fall under a more sensitive category and thus require a custom approach.
  13. What about partial content removal, like re-writing or altering content, instead of outright removal or deletion to comply with an order?
    There may be scenarios where certain content must be removed to comply with a legal order, but where we can subsequently work with volunteer editors to identify a legally permissible rewrite. These specifics, however, lie outside the scope of the immediate response policy and are best handled on a case-by-case basis. If you have questions about rewriting content, reach out to legal(_AT_)wikimedia.org.
  14. What if a volunteer restores content removed under a TCO removal order?
    If you feel strongly about restoring content removed under a TCO removal order, we encourage you to revise said content to ensure it does not fit one of the categories above.
    It is possible that community volunteers could put themselves at legal risk if they restore the exact same content removed under a TCO removal order.
  15. How are other hosting service providers addressing these concerns?
    Several hosting providers, including social networking platforms, have put policies in place to address violent extremist content and the presence of terrorist individuals and organizations on their platforms. However, it is unclear whether they held community discussions or took their users’ concerns into consideration while establishing these policies.
    The Foundation is committed to ensuring that its volunteers are kept well-informed about such updates, and that their concerns (if any) are addressed effectively.
  16. Will it be possible to include Community oversight/audit options for the removed content?
    Yes, community oversighters or suppressors, who have entered into non-disclosure agreements with the Foundation, may access Terror Content removed in response to a legal order.