Wikimedia Foundation Annual Plan/2024-2025/Goals/Safety & Integrity

Free access to trustworthy knowledge offered by the Wikimedia projects is more valuable than ever. In an era of escalating political and geopolitical conflict, coordinated disinformation campaigns, attacks against online freedoms, and widespread deployment of AI, the world needs Wikimedia.

Protect our people and projects

Strengthen the systems that provide safety for volunteers. Defend the integrity of our projects. Advance the environment for free knowledge.

All of our work to defend the safety of Wikimedia's people and the integrity of Wikimedia's projects is rooted in human rights principles, like our defense of freedom of expression and privacy, so that the projects can operate globally and equitably. The integrity of Wikimedia projects needs protection to resist external challenges and threats. Meanwhile, the laws governing online platforms continue to undergo an unprecedented amount of change globally. The Digital Services Act and the Online Safety Act are just two examples of an ongoing global trend. Amid this change, regulators will only protect and support Wikimedia people and projects to the extent that they understand the unique role Wikimedia projects play in advancing the public interest.

The work outlined under this year's Safety & Integrity goal responds to two key external trends:

  • The veracity of content is increasingly contested. Political and geopolitical conflicts are making it harder for editors and audiences to agree about facts, while AI is being weaponized to spread disinformation.
  • Regulation brings threats and opportunities that vary by jurisdiction.

As we also heard from volunteers in Talking: 2024, Wikimedia projects depend on people who are able to contribute while remaining safe, welcomed, and respected.

To provide for the safety of these volunteers and defend the integrity of Wikimedia projects, we will:

  1. Protect our People: Strengthen the policies and systems that provide for the safety of volunteers. We will do this by strengthening trust and safety on Wikimedia projects, protecting human rights when our projects are at risk, and building technology to prevent scaled abuse.
  2. Protect our Projects: Defend the integrity of our projects from attempts to limit access to knowledge. We will do this through legal defense and compliance, and by countering disinformation.
  3. Advance our Model: Promote the value of Wikimedia's free knowledge model so that more people will help us protect it. We will do this by advancing a positive vision for internet policy that supports and protects the broader digital commons upon which Wikimedia depends.

Protect our people


Maggie Dennis: We will strengthen the policies and systems that provide for the safety of volunteers. We will do this by supporting community self-governance, protecting human rights when our projects are at risk, and strengthening trust and safety on Wikimedia projects.

We plan to continue to strengthen our support for community self-governance programs that handle user conduct matters, ensure the Wikimedia projects uphold the principles in the Universal Code of Conduct, and operate according to standards that meet Wikimedia's Human Rights Policy.

Trust & Safety


Jan Eissfeldt: Our collaboration with communities in guiding our projects constructively forward is arguably our greatest strength. The Wikimedia Movement is based on the power and agility of crowdsourcing, enabling the projects to expand content and to update information swiftly and transparently. Wikimedia's crowdsourced self-governance is also a key mechanism for trust and safety in our projects: it allows us to be far more nimble than a centralized, staff-only model could be, not only in keeping our sites legally compliant but also in crafting and enforcing a truly global suite of governance policies that reflect the principles of the Wikimedia projects.

We support community self-governance through committees composed largely, and often exclusively, of volunteers: members of our international body of volunteers who monitor content and address user conduct issues on our sites. Their activities range from reviewing and resolving user disputes, to evaluating administrator activity against our global policies, to providing an appeal process for users subject to Foundation sanctions. They bring a diversity of experience and perspectives and help ensure that these key workflows reflect our international Movement. Despite their commitment, these volunteers often face time constraints and other challenges, including, in some cases, personal risk when their work involves curtailing abuse of our systems. We are here to help them.

Activities include:

  • Supporting the Case Review Committee, the Ombuds Commission, several NDA-holding Arbitration Committees, and the Stewards, as well as the Universal Code of Conduct Committee that is likely to be newly seated, through staff liaising as needed, training where appropriate, and administrative support tailored to the needs of each group;
  • Protecting administrators who are targeted by overzealous litigation;
  • Providing resources for content curators, with a particular emphasis on site administrators, to help keep them aware of the changing international landscape of legal concerns;
  • Working with volunteers to update and strengthen existing policies related to cross-project integrity.

Human rights


Rebecca MacKinnon: The Wikimedia movement both relies upon and enables human rights. Fundamentally, our projects enable every person to exercise a critical human right: to seek, receive, and impart information. But the very process of curating, sharing, and contributing knowledge also has implications for fundamental rights centered on free expression, privacy, and equity, among others. We take seriously our role in supporting a safer environment in which information can be shared and received responsibly.

It is important to acknowledge that the Wikimedia Foundation does not and cannot stand alone in this work. We rely on our many volunteers, who tirelessly scan the sites we host for material that violates the rights of others, whether that means removing unsourced material about human beings or content of unclear copyright status. We rely on a host of global partners who support our volunteers in times of danger, both offering them immediate relief and resources and advocating for their long-term wellbeing. We rely on the many legislators, regulators, and other policymakers who are committed to responsibly keeping information free for this generation and into the future.

Activities include:

  • Improving understanding of human rights in the context of Foundation work and Wikimedia projects, to amplify the influence and impact we can have with decision makers, volunteers, and staff as key messengers for this work;
  • Strengthening the Foundation's ability to support and protect volunteers who face threats to their human rights in connection with their contributions to the projects;
  • Formalizing human rights due diligence processes so that staff and volunteers have the resources and knowledge frameworks necessary to identify potential risks to volunteers and others that may arise in the course of operating and governing Wikimedia projects;
  • Undertaking the first independent assessment of the Foundation's efforts to protect freedom of expression and privacy in relation to Wikimedia projects, as required by our membership in the Global Network Initiative;
  • Reviewing and providing experimental support for technology that enhances volunteers' ability to safely contribute to the Wikimedia projects, as well as readers' ability to safely access all Wikimedia content, such as Wikimedia DNS (see the sketch after this list);
  • Proactively communicating to key audiences (public, movement, policymakers) about the Foundation's work to protect the rights of readers and contributors to the Wikimedia projects, to increase awareness and understanding.
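
To make the Wikimedia DNS item above concrete, here is a minimal sketch, in Python with the dnspython library, of resolving a hostname through the Foundation's public DNS-over-HTTPS resolver. This is a sketch under assumptions, not a definitive implementation: the endpoint URL is an assumption based on the public service and should be verified against its current documentation.

  # Minimal sketch: resolve a hostname via Wikimedia DNS over HTTPS,
  # so lookups bypass local resolvers that may be censored or monitored.
  # The endpoint URL below is an assumption; verify it against current docs.
  import dns.message  # dnspython >= 2.x; DoH also needs httpx (or requests
  import dns.query    # in older dnspython versions) installed

  DOH_ENDPOINT = "https://wikimedia-dns.org/dns-query"  # assumed endpoint

  def resolve(hostname, rdtype="A"):
      # Build a standard DNS query and send it over encrypted HTTPS.
      query = dns.message.make_query(hostname, rdtype)
      response = dns.query.https(query, DOH_ENDPOINT)
      return [item.to_text() for rrset in response.answer for item in rrset]

  print(resolve("en.wikipedia.org"))

Because the query travels over ordinary HTTPS, it is indistinguishable from regular web traffic, which is what makes resolvers like this useful to readers and contributors on monitored networks.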

Scaled Abuse


Each goal in this year's annual plan is also supported by Product and Technology work. The OKRs below detail that work for the Safety & Integrity goal.

Protect our projects


Aeryn Palmer: We will defend the integrity of our projects from attempts to limit access to knowledge. We will do this through legal defense and compliance, and by countering disinformation.

Legal defense and compliance

Aeryn Palmer: In the past year, multiple new or proposed laws have focused on hosts of online platforms, and the Foundation has invested in a multi-pronged approach consisting of legal analysis and evaluation, regulator education, legal defense, and compliance where doing so is consistent with human rights standards. In FY 24-25, we will continue to execute this strategy.

Laws that do not take the Wikimedia model into account can have harmful consequences for our projects and the people who build them—such as censorship, arrest of editors, or costs that may disrupt Foundation operations—regardless of whether the harms are intended. When faced with laws that may have these unintended consequences, we work to educate lawmakers and regulators about the potential harms. Outreach and communications also serve as important avenues to raise awareness and, in some cases, garner public support for our goals.

When a proposed law is passed, we conduct a detailed review to understand whether it could apply to the projects and/or the Foundation, assessing it under our Applicable Law and Human Rights policies. Using this analysis, we determine whether the Foundation, in some cases working with the communities of contributors, may want to take action in response to the law. If action is needed, we will create an appropriate plan for the required changes.

Legal developments can occur unexpectedly throughout the year, so this list cannot be comprehensive. We provide here a few examples of how we approach this challenge.

Activities include:

  • Working in partnership with the Wikimedia communities toward a substantive update and overhaul of our Privacy Policy, to be completed and to take effect in fiscal year 2025–26. We will reinforce our commitment to protecting user data on the projects while supporting Foundation and community developers in creating high-quality tools that protect users, and the free knowledge they contribute and share, from abuse and vandalism.
  • Complying with applicable laws as appropriate. Here are a few key examples of work for next fiscal year:
    • Continuing our Digital Services Act (DSA) compliance program, in response to this significant European Union legislation that affects online hosts. In particular, next fiscal year we will improve and refine our DSA transparency reporting and complete our first required audit.
    • Evaluating the United Kingdom's Online Safety Act, and planning accordingly for any changes to Foundation operations that we assess as prudent. This law is one among many concerned with the online activities of children in particular, and we will look to our Child Rights Risk Assessment to guide future work that can increase safety on the Wikimedia projects.
  • Preparing for a ruling, expected this fiscal year, in the US Supreme Court case NetChoice v. Paxton, which could significantly change the legal rules for hosting user-generated content and could present risks not only to the Foundation but to individual Wikipedians. Next fiscal year, we will work to defend the projects if the outcome gives rise to new lawsuits that try to remove content or harm users. We will also monitor legislative options that may be advanced following the ruling, and pursue public-facing communications that reinforce our key messages.
  • Strengthening our collaboration with Wikimedia communities in some jurisdictions to advocate for needed reforms. In the jurisdictions with the most power to shape our projects, we will work with volunteers to advocate for legal reforms that better protect our projects and people from bad-faith litigation by people and entities seeking to advance their points of view or self-promoting narratives.
  • Expanding our program of strategic litigation to protect the Wikimedia projects, contributors, and model in light of the changing regulatory environment. We will search for and evaluate opportunities to file amicus briefs in courts around the world, where the Foundation's voice may help local or international courts understand how the laws they interpret affect Wikipedia. For example, we will look for useful test cases that help clarify and expand the public domain and ensure that defamation and right-to-be-forgotten issues are clearly understood, so that users know what types of content are safe to contribute to the projects in different places around the world. Where the potential impact is great enough, we will consider bringing legal action directly to support this work. In key jurisdictions, communications can help amplify our messages to public and policymaker audiences.

Countering disinformation


Rebecca MacKinnon: The Foundation will continue working actively to strengthen our communities' ability to anticipate, prepare for, neutralize, and ideally prevent disinformation attacks.

Wikimedia projects are increasingly regarded as trusted sources of knowledge across the world. With an unprecedented number of elections worldwide in 2024 and rising geopolitical conflict, the truth is more contested than ever. Politicians and governments have made deliberate efforts to discredit Wikipedia through disinformation campaigns in mainstream and social media. The open nature of the projects themselves makes us vulnerable to information warfare, including disinformation (information shared with the intent to mislead).

Countering disinformation is closely connected to the broader work of countering on-wiki misinformation (information that is inaccurate but not deliberately meant to mislead), which is part of the core work of volunteers. Supporting volunteers' work to improve the information integrity of the wikis continues to be a key priority in 2024–2025.

Activities include:

  • Enhancing investigations that identify on-wiki disinformation, in collaboration with volunteer functionaries and with support from research and partnerships;
  • Strengthening and expanding moderator tools, including transparent and accountable use of machine learning in service of volunteers' efforts to identify and address disinformation (see the sketch after this list);
  • Sharing information, tactics, and resources among functionaries, other key self-governance stakeholders, and affiliates working to counter disinformation in their communities;
  • Improving policymakers' understanding of the Wikimedia model, and of what types of laws and public policies can protect and support volunteers' ability to fight disinformation, in jurisdictions where laws have significant impact on the projects' operations;
  • Creating a new 2024 elections task force to coordinate the Foundation's work countering disinformation during multiple global elections and to support volunteers;
  • Strengthening relationships with external researchers who study online disinformation, thereby informing on-wiki disinformation tracking, improving access to a broader range of fact-checking resources, and improving access to external research about disinformation trends as they emerge on other platforms;
  • Deploying public-facing communications strategies that enable the Foundation to tell an effective story about its work to prevent mis- and disinformation on the Wikimedia projects in key markets (India, the EU, and the US) during this critical election year;
  • Reinforcing awareness of our unique model and of how it provides an antidote to mis- and disinformation, to help enhance brand affinity and trust.
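
As a hedged illustration of the machine-learning item above, the sketch below shows how a moderation workflow might request a revision-risk score from the Foundation's Lift Wing inference service, the kind of transparent signal that can help volunteers triage suspect edits. It is a sketch under assumptions, not a definitive implementation: the model id, endpoint path, and response shape follow public Lift Wing conventions and should be verified against the current API documentation, and the revision id is purely hypothetical.

  # Hedged sketch: ask a Lift Wing revision-risk model how likely an edit
  # is to be damaging. Model id, endpoint path, and response shape are
  # assumptions; check the current API documentation before relying on them.
  import requests

  LIFTWING_URL = (
      "https://api.wikimedia.org/service/lw/inference/v1/models/"
      "enwiki-damaging:predict"  # assumed model id for English Wikipedia
  )

  def score_revision(rev_id):
      # POST the revision id; the service returns a JSON score payload.
      resp = requests.post(LIFTWING_URL, json={"rev_id": rev_id}, timeout=30)
      resp.raise_for_status()
      return resp.json()

  print(score_revision(123456789))  # hypothetical revision id for illustration

A score like this is a triage aid, not a verdict: in the Wikimedia model, volunteers review the flagged edit and make the final call, which is what keeps the use of machine learning accountable.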

Advance our model


Jan Gerlach: We will promote the value of Wikimedia's free knowledge model so that more people will help us protect it. We will do this by advancing a positive vision for internet policy that supports and protects the broader digital commons upon which Wikimedia depends.

As part of our legal and policy advocacy work, we represent the interests of the Wikimedia communities publicly through a range of proactive outreach and communications activities in key jurisdictions. We will educate policymakers and the stakeholders who influence them (like media, academics, and civil society groups) about how Wikimedia's model works and how the projects contribute positively to society. We look for ways to make the movement's legal and regulatory environment more favorable for safe and inclusive contributions. We will advocate for a positive vision of internet policy that advances free knowledge in the context of our work, and against government laws, regulations, or behaviors that threaten the open internet upon which our communities depend. This advocacy is prioritized based on Wikimedia's opportunity to have impact and the relevance of the issue to Wikimedia's people and projects.

Results include:

  • We will have deepened and expanded partnerships with affiliates and volunteers to educate policymakers, governments, and other stakeholders about Wikimedia's model of community governance. This will help our external stakeholders understand how the Foundation supports the movement, and what types of laws and public policies can protect and promote volunteers' ability to share well-sourced information responsibly.
  • Support and coordination for affiliates and volunteers advocating for copyright reforms will be further strengthened, to ensure that the law protects and supports the use of free and open licensing and that governments make works and content freely available.
  • We will have identified potential test cases for impact litigation to help clarify and expand the public domain and to help ensure that defamation and right-to-be-forgotten issues are clearly understood, so that users know what types of content are safe to contribute to the Wikimedia projects in different places around the world.
  • Partnerships with researchers are broadened for the purpose of collecting and disseminating empirical, independent research on the impact of the Wikimedia projects. This research will be key to providing clear evidence that we can cite to policymakers and media of how the Wikimedia movement advances the public interest by contributing to the economic and social well-being of people in particular communities and countries.
  • Our UN engagement strategy produces improved understanding, by stakeholders across the UN system, of how the Wikimedia projects contribute directly to the Sustainable Development Goals.
  • Policymakers in key international organizations and intergovernmental bodies recognize that Wikimedia projects are an important ingredient of digital public infrastructure that must be supported and protected.
  • Relationships with human rights and civil society organizations are strengthened. As a result of their improved understanding of Wikimedia's governance model and our communities' contributions to knowledge equity, more allies will vocally advocate for policy positions that protect Wikimedia projects as a public good.
  • Policymakers and policy influencers (media, civil society, etc.) in countries whose laws have a direct impact on the projects are educated about how Wikimedia communities develop and deploy machine learning tools that support volunteers' efforts to improve the encyclopedic quality of the projects and fight disinformation.
  • Policymakers and policy influencers gain an improved understanding of the emerging relationship between free knowledge projects, including the Wikimedia projects, and generative AI, and of how and why laws and regulations should protect Wikimedia's human-centric technology development and content governance model in service of the public interest.

