Anti-Disinformation Repository
Hello and welcome to the anti-disinformation initiatives and tools repository!
Wikimedia communities have been working for years to promote trustworthy information and to act against disinformation. This work is multifaceted and takes many shapes: some communities develop lists of reliable or unreliable sources; others organize media literacy trainings; some create bots or machine learning software to automatically detect harassment, use of bad sources, or article quality; and other volunteers work collectively to develop WikiProjects that ensure accurate information is shared on complex topics such as COVID-19 or climate change.
There is a vast number of activities, projects, and expertise related to countering disinformation within the Wikimedia communities. However, these projects deserve to be more widely known and publicized across the communities themselves and beyond them.
The Global Advocacy team at the Wikimedia Foundation launched a public mapping to collect anti-disinformation initiatives and tools developed at the local level across Wikimedia projects and to create a repository of projects. We are confident that volunteers and affiliates can use it to better understand the movement's resources against disinformation, easily access and implement these resources to strengthen their own work addressing this challenge, and get ideas on how to continue curbing false and misleading information worldwide.
We know that this is only a partial list of the fantastic work and efforts that the Wikimedia communities as well as external organizations contribute to the encyclopedic work done on Wikipedia and the other projects. For that reason, please let us know what we missed and also what you think by commenting and discussing on the Talk page! You can also reach out to the Foundation’s Anti-Disinformation Strategy Lead, Costanza Sciubba Caniglia (csciubbacaniglia@wikimedia.org), and/or the Global Advocacy team (globaladvocacy@wikimedia.org). Together we can continue to make sure that Wikimedia acts as an antidote to disinformation!
If you have additional resources you would like to include, please send them to us before modifying this page directly. We are tracking existing resources and working on additional tools, and want to make sure that we include all the projects.
Lastly, if you want to find out more information about this repository, please read our Diff blog post.
Anti-Disinformation Repository (Sortable Table)
Project | Description | Region | Type | Group | Year | Wikimedia project |
---|---|---|---|---|---|---|
¿Cómo funciona Wikipedia? 10 datos fundamentales para entenderla y sus principales desafíos / How does Wikipedia work? 10 fundamental facts to understand it and its main challenges | Document with main points about how Wikipedia works, challenges of the platform and how information is built collaboratively. | Global | Document | Wikimedia Argentina | 2022 | Wikipedia |
¡Que no te pille la tecnología! Personas Mayores Informadas / Don't let technology catch you! Senior Citizens Informed | This video was created as part of a campaign to empower the senior population to avoid disinformation when using the internet. Participants were volunteer attendees of the workshop "Senior people and digital environments: tools for integration", organized by Wikimedia Chile, Google Chile, and the Quilicura municipality in Santiago, who shared their own experiences on how they use online information. | Latin America & Caribbean | Video | Wikimedia Chile | 2022 | N/A |
1 Lib 1 Ref (#1Lib1Ref) | Twice per year, #1Lib1Ref—an abbreviation for “one librarian, one reference”—calls on librarians around the world, and anyone who has a passion for free knowledge, to add missing references to articles on Wikipedia. By increasing the use of reliable sources, the project improves the accuracy of the information.[1] | Global | Edit-a-thon | Wikimedia Foundation | 2016 (according to Talk page) | Wikipedia |
ARTT (Analysis and Response Toolkit for Trust) | Funded by the National Science Foundation’s Convergence Accelerator, the ARTT project aims to support trust-building online conversations. The main tool of the project, the ARTT Guide, is a web-based software assistant that provides a framework of possible responses for everyday conversations around tricky topics. By bringing together insights about how to engage in positive exchanges around credible information, and by providing guidance, encouragement, and inspiration, the ARTT Guide will help our users answer the question: “What do I say (and how do I say it)?” | North America | Tool | National Science Foundation’s Convergence Accelerator, Hacks/Hackers, and the Paul G. Allen School of Computer Science & Engineering, University of Washington (see footnote [1]) | 2021 | N/A |
Ciclo de Charlas "Diálogos Urgentes" / "Urgent Dialogues" Talk Series | Webinar about disinformation, Wikimedia projects, and collaborative coverage from a human rights perspective. | Global | Podcast | Wikimedia Argentina | 2020 | Wikipedia / Wikimedia Commons / Wikidata |
Citation Hunt | Citation Hunt is a tool for browsing snippets of Wikipedia articles that lack citations. It is available in several languages and allows browsing snippets either randomly or by their article's categories in a quick and fun way.[2] | N/A | Tool | N/A | 2017 | Wikipedia |
Cite Unseen | Cite Unseen is a user script that adds categorical icons to Wikipedia citations, indicating, for example, whether a citation is a news article, opinion piece, blog, state-run media outlet, or published book. These categorical icons provide readers and editors with a quick initial evaluation of citations at a glance. This helps guide users on the nature and reliability of sources and helps identify sources that may be problematic or should be used with caution.[3] | North America | Tool | WikiCred | 2019 | Wikipedia |
CiteWatch | This is a bot that compiles lists of unreliable sources that are cited by Wikipedians via a specific citation template, based on several lists of unreliable or questionable publications (such as those compiled by Jeffrey Beall, the University Grants Commission, Quackwatch, or Wikipedia editors).[4] | N/A | Tool | N/A | 2018 | Wikipedia |
Completude, consistência e correção em bases de dados digitais sobre mortos e desaparecidos políticos na ditadura civil-militar brasileira / Completeness, consistency, and correction in digital databases about victims and political desaparecidos during the Brazilian civic-military dictatorship | This is a research project focused on improving content about those who were killed or disappeared during the military dictatorship in Brazil (1964–1985). In a context in which the federal government was shutting down websites with official content on human rights violations during the military regime, Wiki Movimento Brasil and partners focused on bringing this information into Wikidata and Wikipedia. | Latin America & Caribbean | Research | Wiki Movimento Brasil | 2021 | Wikidata / Wikipedia |
Covid-19 et désinformacion / COVID-19 and disinformation | This is a research project on COVID-19 disinformation on Wikipedia. | Northern & Western Europe | Research | Wikimedia France | 2020 | Wikipedia |
Croatian Wikipedia Disinformation Assessment | This report evaluates a case of long-term systemic disinformation and project capture on the Croatian-language Wikipedia. The report was produced by an external expert, who conducted a thorough analysis of the content and community dynamics on the Croatian language Wikipedia. It revealed a widespread pattern of manipulative behavior and abuse of power by an ideologically aligned group of admins and volunteer editors. The views and opinions expressed in this report are those of the author and do not necessarily reflect the official policy or position of the Wikimedia Foundation. | Central & Eastern Europe / Central Asia | Research | Wikimedia Foundation | 2021 | Wikipedia |
CTRL-F | Wikimedia Chile, together with CIVIX, the CIPER Chile journalistic research center, and Fast Check, organizes the CTRL-F program, a training program that teaches students verification and media literacy techniques. | Latin America & Caribbean | Training | Wikimedia Chile | 2023 | N/A |
Decálogos para la lucha contra la desinformación en Wikipedia / Decalogues for the fight against disinformation on Wikipedia | This is a training on misinformation and access to knowledge, organized in collaboration with Medialab-Prado. The event covered knowledge and strategies to fight disinformation on Wikipedia and was aimed at journalists, researchers, activists, and Wikimedians. | Northern & Western Europe | Training | Wikimedia España | 2020 | Wikipedia |
Disinfo 101 | An anti-disinformation training resource created by the Wikimedia Foundation’s Trust and Safety Disinformation team. It provides a basic introduction to terminology associated with disinformation, teaches how disinformation typically functions within the Wikimedia ecosystem, and provides a basic toolkit for dealing with disinformation on Wikimedia projects. | Global | Training | Wikimedia Foundation | 2023 | All projects |
Flagged Revisions | FlaggedRevs (Flagged Revisions) is a MediaWiki extension that allows editors to flag revisions of articles and thus provide additional information about their quality. It also makes it possible to change which version an unregistered user sees by default. The technical description can be found at mw:Extension:FlaggedRevs. This makes it easier to increase article quality and trustworthiness.[5] | N/A | Tool | N/A | 2014/2017 (A 2017 moratorium established that no new Wikis can be added but those that were already active can maintain it) | MediaWiki |
HRVVMC Wikipedia Edit-a-thons | The Human Rights Violations Victims’ Memorial Commission (HRVVMC) is a government agency tasked with documenting the human rights violations that occurred during the Marcos regime and with establishing a museum and a roll of victims. WikiSocPH worked with HRVVMC to help design training modules to train volunteers on how to edit Wikipedia articles related to the Marcos family and their regime. The Commission has conducted Wikipedia edit-a-thons to commemorate the initial declaration of martial law in 1972 and the Revolution in 1986 that ended the dictatorship. | East, Southeast Asia, & Pacific | Edit-a-thon | WikiSocPH | 2020–2022 | Wikipedia |
I Encuentro sobre lucha contra la desinformación en Wikipedia / 1st Meeting on the struggle against disinformation on Wikipedia | This is a roundtable discussion with experts on the topic of disinformation on Wikipedia, organized by Wikimedia España in 2020. | Northern & Western Europe | Event | Wikimedia España | 2020 | Wikipedia |
International Science Council and Wikimedia Foundation webinar series: Exploring the opportunities and challenges of Wikipedia for science | The Wikimedia Foundation organized a series of webinars in collaboration with the International Science Council to explain the Wikimedia model and how it contributes to creating trustworthy information. The first webinar, Webinar 1: Managing Knowledge Integrity on Information Platforms, was held on March 16, 2023, and discussed how Wikipedia can serve as a model for safeguarding the provenance of scientific information online. Speakers were Diego Sáez Trumper (Wikimedia Foundation), Connie Moon Sehat (Hacks/Hackers), and Amalia Toledo (Wikimedia Foundation). The second webinar, Webinar 2: Building Special Projects on Wikipedia: The COVID Case Study, was held on March 30, 2023, and showcased the work of the community, in particular on the COVID-19 pandemic. The speaker was Netha Hussain, a volunteer who created the WikiProject COVID-19. | Northern & Western Europe | Event | Wikimedia Foundation | 2023 | Wikipedia |
Introdução ao Jornalismo Científico / Introduction to Scientific Journalism | This project relied on a partnership with a research lab, NeuroMat, and is a resource for young professionals to specialize in science journalism. Modules 3 and 6 are focused on how misinformation is spread in Brazil. | Latin America & Caribbean | Training | Wiki Movimento Brasil | 2017 | Wikiversity |
Knowledge integrity, Wikimedia Research | The Wikimedia Foundation’s Research team published a set of white papers that outline plans and priorities for the next 5 years. These white papers, which were developed collaboratively by all members of the team, reflect our thinking about the kind of research necessary to further the 2030 Wikimedia Strategic Direction of Knowledge Equity and Knowledge as a Service.[6] | Global | Research | Wikimedia Foundation | 2019- | N/A |
Laboratorio Wikimedia de verificación de datos / Wikimedia workshop on data verification | This is a training, called the Wikimedia fact-checking lab, on learning and collaboration around Wikimedia projects; its goal is to explore information and data verification in order to defend Wikipedia from misinformation. | Northern & Western Europe | Training | Wikimedia España | 2021 | Wikipedia |
Machine Learning models, Wikimedia Research | The Wikimedia Foundation’s Research team has built machine learning models to assist patrollers in detecting revisions that might be reverted, independently of whether they were made in good faith or with the intention of creating damage. Previous models were language-specific solutions (e.g., ORES), which are difficult to scale and maintain and are only available in certain languages. In contrast, this new generation of machine learning models is based on language-agnostic features, making it possible to use them for any existing Wikipedia and for new language projects that may appear in the future. | Global | Tool | Wikimedia Foundation | 2022 | Wikipedia |
masz – MediaWiki | masz (the z is silent) is a tool that helps Checkusers (a small group of Wikipedia users who work on spotting fake accounts) assess similarity of style and language on talk pages in order to find users with multiple accounts.[7] | N/A | Tool | Wikimedia Foundation | 2021 | Wikipedia / Wikidata / Wikinews |
Moderator Tools | The Moderator Tools team is a Wikimedia Foundation product team working on content moderation tool needs.[8] | N/A | Tool | Wikimedia Foundation | 2021- | Wikipedia |
New Page Patrol Source Guide | This page centralizes information about reliable sources for new page reviewers to use when they review new articles. It is intended as a supplement to the reliable sources noticeboard and List of Perennial Sources, to help page reviewers unfamiliar with a given subject assess notability and neutrality of an article.[9] | N/A | Wikipedia Page | N/A | 2019 | Wikipedia |
Observatoire de sources / Sources observatory | This is a list of reliable and unreliable sources to use on French Wikipedia. | Northern & Western Europe | Wikipedia Page | Wikimedia France | 2020 | Wikipedia |
Open the Knowledge Journalism Award | Disinformation on Wikimedia projects often appears in areas in which there are content voids. For this reason, filling knowledge gaps is one way Wikimedia can work to increase trustworthy information on the projects. The Wikimedia Foundation has launched the Open the Knowledge Journalism Awards, which recognize the essential role journalists play in creating well-researched articles that volunteer editors can use as source materials to develop content on Wikipedia and other Wikimedia projects. The award is focused on articles by African journalists increasing knowledge about the African continent.[10] | Middle East & North Africa / Sub-Saharan Africa | Award | Wikimedia Foundation | 2023 | N/A |
ORES | ORES is a web service and API, maintained by the Wikimedia Foundation's Machine Learning team, that provides machine learning as a service for Wikimedia projects. The system is designed to help automate critical wiki-work—for example, vandalism detection and removal.[11] | Global | Tool | Wikimedia Foundation | 2016 | All projects |
Project: Patrolling/Monitoring vandalism (Wikipedia Ukrainian) | This project is no longer active. It was a WikiProject to check all edits made by anonymous users, which helped prevent vandalism from remaining in articles for a long time. The project kept a patrol log of anonymous edits according to this filter. The project was supported by Wikimedia Ukraine but maintained by individual dedicated editors. | Central & Eastern Europe / Central Asia | Wikipedia Project | Wikimedia Ukraine | 2020 (archived) | Wikipedia |
Project:Sources | This is a WikiProject dedicated to improving the reliability of Wikipedia articles. The volunteers working on this project ensure that articles are supported by reliable sources, answer inquiries on the reliable sources noticeboard, and maintain a list of frequently discussed sources. This project works to achieve the goals of the verifiability and no original research policies. For more information about the reliability of sources at Wikipedia, please visit our FAQ. | Northern & Western Europe | Wikipedia Project | Wikimedia France | 2011 | Wikipedia |
PSS 9 | PSS 9 is an intelligent software agent-administrator (a bot with administrative rights) on the Bulgarian Wikipedia, which employs artificial intelligence to identify and actively counter vandalism, including the spread of mis- and disinformation.[12] | Central & Eastern Europe / Central Asia | Tool | N/A | 2017 | Wikipedia |
Reading Wikipedia in the classroom | Reading Wikipedia in the Classroom is Wikimedia Foundation’s teacher training program. By leveraging Wikipedia as a pedagogical tool, it helps educators and students develop vital media and information literacy skills for the 21st century.[13] | Global | Training | Wikimedia Foundation | 2021 | Wikipedia |
Reliable sources noticeboard | This is a noticeboard in which the community can discuss reliable sources. Guidelines for the use of reliable sources on En.WP can be found at Wikipedia:Reliable sources and Wikipedia:Reliable sources/Perennial sources. | Global | Wikipedia Page | N/A | 2007 | Wikipedia |
Special:AbuseFilter891 | This is a filter that detects predatory publishers and flags them for editors. | North America | Tool | N/A | 2017 | Wikipedia |
Template CiteQ | This template returns a formatted citation from statements stored on a Wikidata item (referred to by its Q identifier or QID) describing a citable source such as a scholarly article. This project allows easy identification of citations to papers that have been retracted or replaced.[14] | N/A | Tool | N/A | 2017 | Wikipedia |
Unreliable guidelines – Reliable sources and marginalized communities in French, English, and Spanish Wikipedias | This is a report of the research project Reading Together: Reliability and Multilingual Global Communities on reliable sources and marginalized communities on Wikipedia. | North America / Latin America & Caribbean | Research | Art+Feminism, WikiCred | 2021 | Wikipedia |
Unreliable/Predatory Source Detector (UPSD) | The Unreliable/Predatory Source Detector (UPSD) is a user script that identifies various unreliable and potentially unreliable sources.[15] | North America | Tool | N/A | 2020 | Wikipedia |
User:JL-Bot/DOI | DOIs are unique identifiers for academic articles. This bot checks DOIs registered with Crossref and can show who is associated with a given DOI. | North America | Tool | N/A | 2007 | Wikipedia |
Vanity and Predatory Publishing | This page describes vanity and predatory publishers, with links to lists of both and an explanation of why they might be dangerous for Wikipedia articles. Vanity publishers refer to publishing houses where anyone can pay to have a book published. In contrast, predatory publishers refer to an exploitative academic publishing business model that charges publication fees without checking articles' quality or providing editorial services.[16] | N/A | Wikipedia Page | N/A | 2017 | Wikipedia |
Verificatón en Wikipedia / Verific-a-thon on Wikipedia | This is a training event that promotes a fact-checking culture among the Wikipedian community, the first Wikipedia Verific-a-thon. It is a meeting focused on the verification of knowledge related to health issues and, specifically, the information on COVID-19 found in the free encyclopedia. | Northern & Western Europe | Event | Wikimedia España | 2021 | Wikipedia |
WikiCon Brazil | The theme of the 2022 WikiCon Brazil was disinformation. This theme was developed through trainings, events, and presentations. | Latin America & Caribbean | Event | Wiki Movimento Brasil | 2022 | All projects |
Wikidata Lab XIV, Modelagem com estruturação intencional / Wikidata Lab XIV, Modeling with intentional structuring | This was a training on how to share resources and capabilities for the integration of Wikidata with other Wikimedia projects, especially Wikipedia.[17] | Latin America & Caribbean | Training | Wiki Movimento Brasil | 2019 | Wikidata / Wikipedia |
WikiFactCheckers training | This WikiCred project trains media professionals in how to use Wikipedia, Wikidata, and Wikimedia Commons.[18] | Sub-Saharan Africa | Training | WikiCred, Code for Africa | 2023 | Wikipedia / Wikidata / Wikimedia Commons |
Wikimedia and Democracy – The impact of Wikimedia UK's information literacy work on citizen engagement | This is a comprehensive analysis, by Wikimedia UK, of its programs and their impact. In particular, it analyzes the impact of Wikimedia UK's information literacy work and how the Wikimedia movement contributes to creating a healthier information environment and a stronger democracy. | Northern & Western Europe | Research | Wikimedia UK | 2021 | All projects |
Wikimedia en Chine / Wikimedia in China | This is a translation of, and commentary on, a memo released by the Wikimedia Foundation regarding an incident in the Chinese community, prepared for the French community. | Northern & Western Europe | Research | Wikimedia France | 2020 | Wikipedia |
Wikimedia in Education | This is a booklet created by Wikimedia UK in collaboration with the University of Edinburgh, presenting a series of case studies that illustrate how to use Wikipedia as a tool for education: supporting learners to understand, navigate, and critically evaluate information, and to develop an appreciation for the role and importance of open education. | Northern & Western Europe | Research / Training | Wikimedia UK | 2020 | Wikidata / Wikipedia |
Wikipédia contra a ignorância racional / Wikipedia against rational ignorance | This is a research paper on Wikipedia against "rational ignorance". João Alexandre Peschanski led this project as an educator before becoming Executive Director of the Wiki Movement Brazil User Group (Grupo de Usuários Wiki Movimento Brasil). It is framed around a specific behavioral dimension of misinformation in electoral politics. A summary is available on this Wikimedia blog post, with a link to a more substantial research paper. The project has since evolved into a Wikidata-integrated system for improving content on elections in Brazil. | Latin America & Caribbean | Research | Wiki Movimento Brasil | 2016 | Wikipedia |
Wikipedia Huggle | Huggle is a browser intended for dealing with vandalism and other unconstructive edits on Wikimedia projects. Huggle loads and reviews edits made to Wikipedia in real time, helps users identify unconstructive edits, and allows them to be reverted quickly. | N/A | Tool | N/A | 2008 | All projects |
Wikipedia Knowledge Integrity Risk Observatory, Wikimedia Research | This is a monitoring system for exploring data and metrics on knowledge integrity risks and comparing Wikipedia language editions. Knowledge integrity risks are captured through the volume of high-risk revisions, identified with the language-agnostic revert risk machine learning (ML) model, and through their revert ratios over time and across pages. | Global | Research | Wikimedia Foundation | 2021- | N/A |
Wikipedia SWASTHA | Wikipedia SWASTHA is a collaborative platform uniting multilingual editors from various Wikipedia communities to exchange best practices and jointly edit healthcare pages. Collaborating with diverse government agencies, the United Nations, and the World Health Organization, SWASTHA focuses on enhancing healthcare awareness in local communities by providing accessible, multilingual health information on Wikipedia. Recognized by prominent media outlets for its impactful work, SWASTHA highlights the ethical dissemination of medical information in local languages to empower individuals with limited access to healthcare resources. The project invites organizations to join its mission and harness the power of Wikipedia to create lasting, positive change in global healthcare. | East, Southeast Asia, & Pacific | Wikipedia Project | N/A | 2020 | Wikipedia |
Wikipedia:Deprecated sources | This is a list of sources that are "deprecated," whose use on Wikipedia is discouraged because they "fail the reliable sources guidelines in nearly all circumstances."[19] | N/A | Wikipedia Page | N/A | 2018 | Wikipedia |
Wikipédia:Fontes confiáveis/Central de confiabilidade / Wikipedia:Reliable sources/Trust Center | This page centralizes discussions related to the sources used in articles and lists sources editors should avoid using in the project, many of which have been associated with disinformation campaigns. It is similar to the lists of unreliable sources that exist for other language Wikipedias.[20] | Latin America & Caribbean | Wikipedia Page | Wiki Movimento Brasil | 2022 | Wikipedia |
Wikipedia:Nierzetelne źródła / Wikipedia:Not reliable sources | This is an editorial recommendation including a list of unreliable sources that should not be used by Polish Wikipedia editors, with an explanation of how these are selected and why. | Central & Eastern Europe / Central Asia | Wikipedia Page | N/A | 2022 | Wikipedia |
Wikipedia:Reliable sources/Perennial sources | This is a non-exhaustive list of sources whose reliability and use on Wikipedia are frequently discussed. This list summarizes prior consensus and consolidates links to the most in-depth and recent discussions from the reliable sources noticeboard and elsewhere on Wikipedia.[21] | N/A | Wikipedia Page | N/A | 2018 | Wikipedia |
Wikipedia:Trovärdiga källor / Wikipedia:Credible sources | This is an essay explaining how to find and use credible sources in general and in particular in the Swedish language. Many Swedish-speaking Wikipedians have been involved in creating and maintaining this resource. | Northern & Western Europe | Wikipedia Page | N/A | 2008 | Wikipedia |
Wikipedia:Vaccine safety/Reliable sources | Launched in April 2022 as a NewsQ project, Wikipedia “Reliable Sources” seeks to support Wikipedians who write articles on vaccines and want to cite reliable sources of vaccine information. NewsQ is collaborating with experienced Wikipedia editors to gather data on source quality and refine a list of reputable sources of vaccine information throughout 2023. NewsQ is a Hacks/Hackers initiative that has sought to elevate quality journalism when algorithms rank and recommend news online. | N/A | Wikipedia Page | News Quality Initiative (NewsQ), Hacks/Hackers, and Knowledge Futures Group (see footnote [2]) | 2022 | Wikipedia |
Wikipedia:WikiProject Climate change | WikiProject Climate change is a collaborative effort to improve Wikipedia articles related to climate change. The WikiProject covers topics related to the causes of climatic change, the effects of climate change, and how society responds in terms of adaptation, mitigation and social and political change.[22] | Global | Wikipedia Page | N/A | 2010 | Wikipedia |
Wikipedia:WikiProject COVID-19/SureWeCan COVID19 Task Force | This task force project focuses on Wikipedia editing and writing to share simple, factual information on the coronavirus pandemic in New York City with the goal of explaining the seriousness of the pandemic.[23] | North America | Wikipedia Project / Edit-a-thon | Sure We Can | 2021 | Wikipedia |
Wikipedia:WikiProject COVID-19 | This is the WikiProject related to Wikipedia coverage of COVID-19. It consolidates and coordinates community efforts to create trustworthy information on the pandemic. It was developed in part through a partnership with the WHO. | Global | Wikipedia Project | N/A | 2020 | Wikipedia |
Wikipedia:WikiProject_Reliability | This is a WikiProject dedicated to improving the reliability of Wikipedia articles. This includes curating reliable sources, answering inquiries on the reliable sources noticeboard, and maintaining a list of frequently discussed sources.[24] | N/A | Wikipedia Project | N/A | 2011 | Wikipedia |
Wikipedian in Residence at Bantayog ng mga Bayani | The Bantayog ng mga Bayani is a foundation that maintains a memorial, museum, and library dedicated to remembering and honoring the heroes, martyrs, and victims of the Ferdinand Marcos dictatorship. WikiSocPH partnered with the Bantayog who hired a Wikimedian in Residence (WiR) to help digitize the Bantayog’s library, to improve Wikipedia’s coverage of articles related to the Ferdinand Marcos regime, and to organize Wikimedia events and workshops to the Bantayog’s visitors, researchers, and supporters. | East, Southeast Asia, & Pacific | Fellowship | WikiSocPH | 2018 | Wikipedia |
Віківишколи/Тренінг для журналістів 19 травня 2019 / Wiki trainings / Training for journalists, 19 May 2019 | This is a master class on "How to bring journalism into Wikipedia" that was held in Kyiv in 2019 and organized by Wikimedia Ukraine together with the Institute for the Development of the Regional Press. The event was part of the Investigative Journalism Month held on Ukrainian Wikipedia in June–July. | Central & Eastern Europe / Central Asia | Training | Wikimedia Ukraine | 2019 | Wikipedia |
Вікіпедія:Авторитетні джерела / Wikipedia: Authoritative sources | This is a blog post explaining a filter and list of untrustworthy sources that flags such sources for editors. | Central & Eastern Europe / Central Asia | Blog post | Wikimedia Ukraine | 2020 | Wikipedia |
ГО «Вікімедіа Україна» попереджає: користуючись Вікіпедією, не платіть шахраям! / NGO "Wikimedia Ukraine" warns: When using Wikipedia, do not pay fraudsters! | This is a blog post explaining how Wikipedia works and warning users against fraud and paid editing. | Central & Eastern Europe / Central Asia | Blog post | Wikimedia Ukraine | 2020 | Wikipedia |
Уикипедия:Патрульори/СФИН / Wikipedia:Patrollers/SFIN | This is a filter maintained by the community of editors preventing the use of unreliable sources in the main space, and also in the incubator, by users who are not members of the "autopatrolled" group. When such a user tries to enter a link to a source in the list, the link is detected by the abuse filter that pulls data from the list and the edit is rejected. | Central & Eastern Europe / Central Asia | Tool | N/A | 2016 (for the abuse filter); 2019 (adopted and enforced as a policy) | Wikipedia |
Як перевіряти інформацію: поради з фактчекінгу від Vox Check / How to check information: Fact-checking tips from Vox Check | This is the abstract of a training on disinformation held by Wikimedia Ukraine in collaboration with Vox Check in 2021. | Central & Eastern Europe / Central Asia | Training | Wikimedia Ukraine | 2021 | Wikipedia |
Як читати новини критично? and Експерти та експертність у медіа / How to read news critically? and Experts and expertise in the media | This is the abstract of two trainings on disinformation and media literacy held by Wikimedia Ukraine in collaboration with Media Detector in 2020. | Central & Eastern Europe / Central Asia | Blog post | Wikimedia Ukraine | 2020 | Wikipedia |
Footnotes
[1] Funded by the National Science Foundation’s Convergence Accelerator, the Analysis and Response Toolkit for Trust (ARTT) project is led by Hacks/Hackers and the Paul G. Allen School of Computer Science & Engineering at the University of Washington. During Phase II of the project, which commenced in October 2022, partner and collaborating organizations include Wikimedia DC, Social Science Research Council (SSRC), Children’s Hospital of Philadelphia, National Public Health Information Coalition, and others. Advisement in Phase II also comes from a member of the World Health Organization’s Vaccine Safety Net.
[2] Affiliates include the News Quality Initiative (NewsQ), Hacks/Hackers, and Knowledge Futures Group, with support from Tow-Knight Center for Entrepreneurial Journalism at the CUNY Newmark Graduate School of Journalism, and Craig Newmark Philanthropies.
Anti-Disinformation Repository (Text Description And Details)
Project: ¿Cómo funciona Wikipedia? 10 datos fundamentales para entenderla y sus principales desafíos / How does Wikipedia work? 10 fundamental facts to understand it and its main challenges
Description: Document with main points about how Wikipedia works, challenges of the platform and how information is built collaboratively.
Country: Argentina
Region: Global
Type: Document
Language: Spanish
Group: Wikimedia Argentina
Year: 2022
Wikimedia project: Wikipedia
Project: ¡Que no te pille la tecnología! Personas Mayores Informadas / Don't let technology catch you! Senior Citizens Informed
Description: This video was created as part of a campaign to empower the senior population to avoid disinformation when using the internet. Participants were volunteer attendees of the workshop "Senior people and digital environments: tools for integration", organized by Wikimedia Chile, Google Chile, and the Quilicura municipality in Santiago, who shared their own experiences on how they use online information.
Country: Chile
Region: Latin America & Caribbean
Type: Video
Language: Spanish
Group: Wikimedia Chile
Year: 2022
Wikimedia project: N/A
Project: 1 Lib 1 Ref (#1Lib1Ref)
Description: Twice per year, #1Lib1Ref—an abbreviation for “one librarian, one reference”—calls on librarians around the world, and anyone who has a passion for free knowledge, to add missing references to articles on Wikipedia. By increasing the use of reliable sources, the project improves the accuracy of the information.[1]
Country: Multiple
Region: Global
Type: Edit-a-thon
Language: Indonesian, Malay, Bikol, German, English, Esperanto, Dutch, Sundanese, Vietnamese, Turkish, Catalan, Danish, Spanish, Basque, French, Galician, Italian, Latvian, Hungarian, Norwegian, Polish, Portuguese, Brazilian Portuguese, Romanian, Slovenian, Suomi, Swedish, Czech, Greek, Belarusian, Macedonian, Russian, Serbian, Ukrainian, Hebrew, Urdu, Arabic, Sindhi, Farsi, Hindi, Bangla, Tamil, Malayalam, Thai, Burmese, Chinese, and Japanese
Group: Wikimedia Foundation
Year: 2016 (according to Talk page)
Wikimedia project: Wikipedia
Project: ARTT (Analysis and Response Toolkit for Trust)
Description: Funded by the National Science Foundation’s Convergence Accelerator, the ARTT project aims to support trust-building online conversations. The main tool of the project, the ARTT Guide, is a web-based software assistant that provides a framework of possible responses for everyday conversations around tricky topics. By bringing together insights about how to engage in positive exchanges around credible information, and by providing guidance, encouragement, and inspiration, the ARTT Guide will help our users answer the question: “What do I say (and how do I say it)?”
Country: USA
Region: North America
Type: Tool
Language: English
Group: Funded by the National Science Foundation’s Convergence Accelerator, the Analysis and Response Toolkit for Trust (ARTT) project is led by Hacks/Hackers and the Paul G. Allen School of Computer Science & Engineering at the University of Washington. During Phase II of the project, which commenced in October 2022, partner and collaborating organizations include Wikimedia DC, Social Science Research Council (SSRC), Children’s Hospital of Philadelphia, National Public Health Information Coalition, and others. Advisement in Phase II also comes from a member of the World Health Organization’s Vaccine Safety Net
Year: 2021
Wikimedia project: N/A
Project: Ciclo de Charlas "Diálogos Urgentes" / "Urgent Dialogues" Talk Series
Description: Webinar about disinformation, Wikimedia projects, and collaborative coverage from a human rights perspective.
Country: Argentina
Region: Global
Type: Podcast
Language: Spanish
Group: Wikimedia Argentina
Year: 2020
Wikimedia project: Wikipedia / Wikimedia Commons / Wikidata
Project: Citation Hunt
Description: Citation Hunt is a tool for browsing snippets of Wikipedia articles that lack citations. It is available in several languages and allows browsing snippets either randomly or by their article's categories in a quick and fun way.[2]
Country: N/A
Region: N/A
Type: Tool
Language: English
Group: N/A
Year: 2017
Wikimedia project: Wikipedia
Project: Cite Unseen
Description: Cite Unseen is a user script that adds categorical icons to Wikipedia citations, indicating, for example, whether a citation is a news article, opinion piece, blog, state-run media outlet, or published book. These categorical icons provide readers and editors with a quick initial evaluation of citations at a glance. This helps guide users on the nature and reliability of sources and helps identify sources that may be problematic or should be used with caution.[3]
Country: USA
Region: North America
Type: Tool
Language: English
Group: WikiCred
Year: 2019
Wikimedia project: Wikipedia
Project: CiteWatch
Description: This is a bot that compiles lists of unreliable sources that are cited by Wikipedians via a specific citation template, based on several lists of unreliable or questionable publications (such as those compiled by Jeffrey Beall, the University Grants Commission, Quackwatch, or Wikipedia editors).[4] An illustrative sketch of this kind of template-and-blocklist matching follows this entry.
Country: N/A
Region: N/A
Type: Tool
Language: English
Group: N/A
Year: 2018
Wikimedia project: Wikipedia
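CiteWatch's own implementation is not documented in this entry. The sketch below only illustrates the general approach the description names: scanning citation templates in wikitext and flagging journals that appear on a blocklist. The use of the mwparserfromhell library and the blocklist contents are assumptions for illustration, not the bot's actual code.

```python
# Illustrative sketch only (not CiteWatch's code): scan wikitext for
# {{cite journal}} templates and flag journals found on a blocklist of
# questionable publications. mwparserfromhell is an assumed helper library.
import mwparserfromhell

# Hypothetical blocklist entries; the real lists come from Beall, UGC, etc.
BLOCKLIST = {"predatory journal of examples"}

def flag_questionable_citations(wikitext: str) -> list[str]:
    """Return journal names cited via {{cite journal}} that are on the blocklist."""
    flagged = []
    for template in mwparserfromhell.parse(wikitext).filter_templates():
        name = str(template.name).strip().lower()
        if name == "cite journal" and template.has("journal"):
            journal = str(template.get("journal").value).strip()
            if journal.lower() in BLOCKLIST:
                flagged.append(journal)
    return flagged

if __name__ == "__main__":
    sample = "{{cite journal |title=X |journal=Predatory Journal of Examples}}"
    print(flag_questionable_citations(sample))
```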
Project: Completude, consistência e correção em bases de dados digitais sobre mortos e desaparecidos políticos na ditadura civil-militar brasileira / Completeness, consistency, and correction in digital databases about victims and political desaparecidos during the Brazilian civic-military dictatorship
Description: This is a research project focused on improving content about those who were killed or disappeared during the military dictatorship in Brazil (1964–1985). In a context in which the federal government was shutting down websites with official content on human rights violations during the military regime, Wiki Movimento Brasil and partners focused on bringing this information into Wikidata and Wikipedia.
Country: Brazil
Region: Latin America & Caribbean
Type: Research
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2021
Wikimedia project: Wikidata / Wikipedia
Project: Covid-19 et désinformacion / COVID-19 and disinformation
Description: This is a research project on COVID-19 disinformation on Wikipedia.
Country: France
Region: Northern & Western Europe
Type: Research
Language: French
Group: Wikimedia France
Year: 2020
Wikimedia project: Wikipedia
Project: Croatian Wikipedia Disinformation Assessment
Description: This report evaluates a case of long-term systemic disinformation and project capture on the Croatian-language Wikipedia. The report was produced by an external expert, who conducted a thorough analysis of the content and community dynamics on the Croatian language Wikipedia. It revealed a widespread pattern of manipulative behavior and abuse of power by an ideologically aligned group of admins and volunteer editors. The views and opinions expressed in this report are those of the author and do not necessarily reflect the official policy or position of the Wikimedia Foundation.
Country: N/A
Region: Central & Eastern Europe / Central Asia
Type: Research
Language: English
Group: Wikimedia Foundation
Year: 2021
Wikimedia project: Wikipedia
Project: CTRL-F
Description: Wikimedia Chile, together with CIVIX, the CIPER Chile journalistic research center, and Fast Check, organizes the CTRL-F program, a training program that teaches students verification and media literacy techniques.
Country: Chile
Region: Latin America & Caribbean
Type: Training
Language: Spanish
Group: Wikimedia Chile
Year: 2023
Wikimedia project: N/A
Project: Decálogos para la lucha contra la desinformación en Wikipedia / Decalogues for the fight against disinformation on Wikipedia
Description: This is a training on misinformation and access to knowledge, organized in collaboration with Medialab-Prado. The event covered knowledge and strategies to fight disinformation on Wikipedia and was aimed at journalists, researchers, activists, and Wikimedians.
Country: Spain
Region: Northern & Western Europe
Type: Training
Language: Spanish
Group: Wikimedia España
Year: 2020
Wikimedia project: Wikipedia
Project: Disinfo 101
Description: An anti-disinformation training resource created by the Wikimedia Foundation’s Trust and Safety Disinformation team. It provides a basic introduction to terminology associated with disinformation, teaches how disinformation typically functions within the Wikimedia ecosystem, and provides a basic toolkit for dealing with disinformation on Wikimedia projects.
Country: N/A
Region: Global
Type: Training
Language: English
Group: Wikimedia Foundation
Year: 2023
Wikimedia project: All projects
Project: Flagged Revisions
Description: FlaggedRevs (Flagged Revisions) is a MediaWiki extension that allows editors to flag revisions of articles and thus provide additional information about their quality. It also makes it possible to change which version an unregistered user sees by default. The technical description can be found at mw:Extension:FlaggedRevs. This makes it easier to increase article quality and trustworthiness.[5]
Country: N/A
Region: N/A
Type: Tool
Language: All
Group: N/A
Year: 2014/2017 (A 2017 moratorium established that no new Wikis can be added but those that were already active can maintain it)
Wikimedia project: MediaWiki
Project: HRVVMC Wikipedia Edit-a-thons
Description: The Human Rights Violations Victims’ Memorial Commission (HRVVMC) is a government agency tasked with documenting the human rights violations that occurred during the Marcos regime and with establishing a museum and a roll of victims. WikiSocPH worked with HRVVMC to help design training modules to train volunteers on how to edit Wikipedia articles related to the Marcos family and their regime. The Commission has conducted Wikipedia edit-a-thons to commemorate the initial declaration of martial law in 1972 and the Revolution in 1986 that ended the dictatorship.
Country: Philippines
Region: East, Southeast Asia, & Pacific
Type: Edit-a-thon
Language: English
Group: WikiSocPH
Year: 2020–2022
Wikimedia project: Wikipedia
Project: I Encuentro sobre lucha contra la desinformación en Wikipedia / 1st Meeting on the struggle against disinformation on Wikipedia
Description: This is a roundtable discussion with experts on the topic of disinformation on Wikipedia, organized by Wikimedia España in 2020.
Country: Spain
Region: Northern & Western Europe
Type: Event
Language: Spanish
Group: Wikimedia España
Year: 2020
Wikimedia project: Wikipedia
Project: International Science Council and Wikimedia Foundation webinar series: Exploring the opportunities and challenges of Wikipedia for science
Description: The Wikimedia Foundation organized a series of webinars in collaboration with the International Science Council to explain the Wikimedia model and how it contributes to creating trustworthy information. The first webinar, Webinar 1: Managing Knowledge Integrity on Information Platforms, was held on March 16, 2023, and discussed how Wikipedia can serve as a model for safeguarding the provenance of scientific information online. Speakers were Diego Sáez Trumper (Wikimedia Foundation), Connie Moon Sehat (Hacks/Hackers), and Amalia Toledo (Wikimedia Foundation). The second webinar, Webinar 2: Building Special Projects on Wikipedia: The COVID Case Study, was held on March 30, 2023, and showcased the work of the community, in particular on the COVID-19 pandemic. The speaker was Netha Hussain, a volunteer who created the WikiProject COVID-19.
Country: United Kingdom
Region: Northern & Western Europe
Type: Event
Language: English
Group: Wikimedia Foundation
Year: 2023
Wikimedia project: Wikipedia
Project: Introdução ao Jornalismo Científico / Introduction to Scientific Journalism
Description: This project relied on a partnership with a research lab, NeuroMat, and is a resource for young professionals to specialize in science journalism. Modules 3 and 6 are focused on how misinformation is spread in Brazil.
Country: Brazil
Region: Latin America & Caribbean
Type: Training
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2017
Wikimedia project: Wikiversity
Project: Knowledge integrity, Wikimedia Research
Description: The Wikimedia Foundation’s Research team published a set of white papers that outline plans and priorities for the next 5 years. These white papers, which were developed collaboratively by all members of the team, reflect our thinking about the kind of research necessary to further the 2030 Wikimedia Strategic Direction of Knowledge Equity and Knowledge as a Service.[6]
Country: N/A
Region: Global
Type: Research
Language: English
Group: Wikimedia Foundation
Year: 2019-
Wikimedia project: N/A
Project: Laboratorio Wikimedia de verificación de datos / Wikimedia workshop on data verification
Description: This is a training, called the Wikimedia fact-checking lab, on learning and collaboration around Wikimedia projects; its goal is to explore information and data verification in order to defend Wikipedia from misinformation.
Country: Spain
Region: Northern & Western Europe
Type: Training
Language: Spanish
Group: Wikimedia España
Year: 2021
Wikimedia project: Wikipedia
Project: Machine Learning models, Wikimedia Research
Description: The Wikimedia Foundation’s Research team has built machine learning models to assist patrollers in detecting revisions that might be reverted, independently of whether they were made in good faith or with the intention of creating damage. Previous models were language-specific solutions (e.g., ORES), which are difficult to scale and maintain and are only available in certain languages. In contrast, this new generation of machine learning models is based on language-agnostic features, making it possible to use them for any existing Wikipedia and for new language projects that may appear in the future. A hedged example of querying such a model follows this entry.
Country: N/A
Region: Global
Type: Tool
Language: N/A
Group: Wikimedia Foundation
Year: 2022
Wikimedia project: Wikipedia
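The language-agnostic revert-risk model has been served through Wikimedia's public Lift Wing inference API. A minimal sketch of scoring one revision is below; the endpoint URL and request payload are assumptions based on public documentation and may change, so treat this as illustrative rather than an official client.

```python
# Minimal sketch of scoring a revision with the language-agnostic revert-risk
# model via Wikimedia's Lift Wing API. Endpoint and payload format are
# assumptions from public documentation; this is not an official client.
import requests

LIFTWING_URL = (
    "https://api.wikimedia.org/service/lw/inference/v1/models/"
    "revertrisk-language-agnostic:predict"
)

def revert_risk(rev_id: int, lang: str = "en") -> dict:
    """Return the model's prediction for a single revision ID."""
    response = requests.post(
        LIFTWING_URL,
        json={"rev_id": rev_id, "lang": lang},
        headers={"User-Agent": "anti-disinformation-repo-example/0.1"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Placeholder revision ID; replace with a real revision from the target wiki.
    print(revert_risk(rev_id=123456789, lang="en"))
```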
Project: masz – MediaWiki
Description: masz (the z is silent) is a tool that helps Checkusers (a small group of Wikipedia users who work on spotting fake accounts) assess similarity of style and language on talk pages in order to find users with multiple accounts.[7] A generic stylometry illustration (not masz's own algorithm) follows this entry.
Country: N/A
Region: N/A
Type: Tool
Language: English, French, Indonesian, Swedish, Italian, Czech, Spanish, Simple English, Portuguese
Group: Wikimedia Foundation
Year: 2021
Wikimedia project: Wikipedia / Wikidata / Wikinews
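masz's internal method is not described in this entry. Purely as a generic illustration of the underlying idea of comparing writing style across accounts, the sketch below computes cosine similarity of character trigram profiles between two talk-page texts; it should not be read as masz's actual algorithm.

```python
# Generic stylometry illustration (not masz's algorithm): compare two talk-page
# texts by cosine similarity of their character trigram frequency profiles.
from collections import Counter
from math import sqrt

def char_ngrams(text: str, n: int = 3) -> Counter:
    """Count overlapping character n-grams after normalizing whitespace/case."""
    text = " ".join(text.lower().split())
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

if __name__ == "__main__":
    text_a = "Per WP:RS this source is clearly unreliable, see the noticeboard."
    text_b = "Per WP:RS that source is clearly unreliable; see the noticeboard."
    print(round(cosine_similarity(char_ngrams(text_a), char_ngrams(text_b)), 3))
```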
Project: Moderator Tools
Description: The Moderator Tools team is a Wikimedia Foundation product team working on content moderation tool needs.[8]
Country: N/A
Region: N/A
Type: Tool
Language: All
Group: Wikimedia Foundation
Year: 2021-
Wikimedia project: Wikipedia
Project: New Page Patrol Source Guide
Description: This page centralizes information about reliable sources for new page reviewers to use when they review new articles. It is intended as a supplement to the reliable sources noticeboard and List of Perennial Sources, to help page reviewers unfamiliar with a given subject assess notability and neutrality of an article.[9]
Country: N/A
Region: N/A
Type: Wikipedia Page
Language: English
Group: N/A
Year: 2019
Wikimedia project: Wikipedia
Project: Observatoire de sources / Sources observatory
Description: This is a list of reliable and unreliable sources to use on French Wikipedia.
Country: France
Region: Northern & Western Europe
Type: Wikipedia Page
Language: French
Group: Wikimedia France
Year: 2020
Wikimedia project: Wikipedia
Project: Open the Knowledge Journalism Award
Description: Disinformation on Wikimedia projects often appears in areas in which there are content voids. For this reason, filling knowledge gaps is one way Wikimedia can work to increase trustworthy information on the projects. The Wikimedia Foundation has launched the Open the Knowledge Journalism Awards, which recognize the essential role journalists play in creating well-researched articles that volunteer editors can use as source materials to develop content on Wikipedia and other Wikimedia projects. The award is focused on articles by African journalists increasing knowledge about the African continent.[10]
Country: N/A
Region: Middle East & North Africa / Sub-Saharan Africa
Type: Award
Language: English
Group: Wikimedia Foundation
Year: 2023
Wikimedia project: N/A
Project: ORES
Description: ORES is a web service and API, maintained by the Wikimedia Foundation's Machine Learning team, that provides machine learning as a service for Wikimedia projects. The system is designed to help automate critical wiki-work—for example, vandalism detection and removal.[11] A minimal sketch of querying the service follows this entry.
Country: N/A
Region: Global
Type: Tool
Language: Multiple
Group: Wikimedia Foundation
Year: 2016
Wikimedia projects: All projects
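ORES exposes its scores over HTTP. The sketch below requests a "damaging" score for a single English Wikipedia revision from the ORES v3 API as it has been publicly documented; since the Foundation has been migrating ORES models to Lift Wing, treat the URL as illustrative rather than guaranteed to remain available.

```python
# Minimal sketch of requesting a "damaging" score for one enwiki revision from
# the ORES v3 API. The URL reflects the publicly documented endpoint; ORES is
# being superseded by Lift Wing, so treat this as illustrative.
import requests

def ores_damaging_score(rev_id: int, context: str = "enwiki") -> dict:
    """Fetch ORES 'damaging' model scores for a revision in the given wiki context."""
    url = f"https://ores.wikimedia.org/v3/scores/{context}/"
    response = requests.get(
        url,
        params={"models": "damaging", "revids": rev_id},
        headers={"User-Agent": "anti-disinformation-repo-example/0.1"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Placeholder revision ID; replace with a real enwiki revision ID.
    print(ores_damaging_score(123456789))
```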
Project: Project: Patrolling/Monitoring vandalism (Wikipedia Ukrainian)
Description: This project is no longer active. It was a WikiProject to check all edits made by anonymous users, which helped prevent vandalism from remaining in articles for a long time. The project kept a patrol log of anonymous edits according to this filter. The project was supported by Wikimedia Ukraine but maintained by individual dedicated editors.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Wikipedia Project
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2020 (archived)
Wikimedia project: Wikipedia
Project: Project:Sources
Description: This is a WikiProject dedicated to improving the reliability of Wikipedia articles. The volunteers working on this project ensure that articles are supported by reliable sources, answer inquiries on the reliable sources noticeboard, and maintain a list of frequently discussed sources. This project works to achieve the goals of the verifiability and no original research policies. For more information about the reliability of sources at Wikipedia, please visit our FAQ.
Country: France
Region: Northern & Western Europe
Type: Wikipedia Project
Language: French
Group: Wikimedia France
Year: 2011
Wikimedia project: Wikipedia
Project: PSS 9
Description: PSS 9 is an intelligent software agent-administrator (a bot with administrative rights) on the Bulgarian Wikipedia, which employs artificial intelligence to identify and actively counter vandalism, including the spread of mis- and disinformation.[12]
Country: Bulgaria
Region: Central & Eastern Europe / Central Asia
Type: Tool
Language: Bulgarian
Group: N/A
Year: 2017
Wikimedia project: Wikipedia
Project: Reading Wikipedia in the classroom
Description: Reading Wikipedia in the Classroom is Wikimedia Foundation’s teacher training program. By leveraging Wikipedia as a pedagogical tool, it helps educators and students develop vital media and information literacy skills for the 21st century.[13]
Country: N/A
Region: Global
Type: Training
Language: English, Arabic, French, Spanish, Tagalog, Adapted English (Nigeria), Yoruba, Indonesian, Aymara, Guaranì, Quechua, Catalan, Dagbani
Group: Wikimedia Foundation
Year: 2021
Wikimedia project: Wikipedia
Project: Reliable sources noticeboard
Description: This is a noticeboard in which the community can discuss reliable sources. Guidelines for the use of reliable sources on En.WP can be found at Wikipedia:Reliable sources and Wikipedia:Reliable sources/Perennial sources.
Country: N/A
Region: Global
Type: Wikipedia Page
Language: English
Group: N/A
Year: 2007
Wikimedia project: Wikipedia
Project: Special:AbuseFilter891
Description: This is a filter that detects predatory publishers and flags them for editors.
Country: USA
Region: North America
Type: Tool
Language: English
Group: N/A
Year: 2017
Wikimedia project: Wikipedia
Project: Template CiteQ
Description: This template returns a formatted citation from statements stored on a Wikidata item (referred to by its Q identifier or QID) describing a citable source such as a scholarly article. This project allows easy identification of citations to papers that have been retracted or replaced.[14]
Country: N/A
Region: N/A
Type: Tool
Language: English
Group: N/A
Year: 2017
Wikimedia project: Wikipedia
Project: Unreliable guidelines – Reliable sources and marginalized communities in French, English, and Spanish Wikipedias
Description: This is a report of the research project Reading Together: Reliability and Multilingual Global Communities on reliable sources and marginalized communities on Wikipedia.
Country: Canada / USA / Peru
Region: North America / Latin America & Caribbean
Type: Research
Language: English
Group: Art+Feminism, WikiCred
Year: 2021
Wikimedia project: Wikipedia
Project: Unreliable/Predatory Source Detector (UPSD)
Description: The Unreliable/Predatory Source Detector (UPSD) is a user script that identifies various unreliable and potentially unreliable sources.[15]
Country: N/A
Region: North America
Type: Tool
Language: English
Group: N/A
Year: 2020
Wikimedia project: Wikipedia
Project: User:JL-Bot/DOI
Description: DOIs are unique identifiers for academic articles. This bot checks DOIs registered with Crossref and can show who is associated with a given DOI. An illustrative sketch of this kind of Crossref lookup follows this entry.
Country: USA
Region: North America
Type: Tool
Language: English
Group: N/A
Year: 2007
Wikimedia project: Wikipedia
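The bot's own code is not reproduced here. As a hedged illustration of the kind of lookup the description mentions, the sketch below resolves a Crossref-registered DOI to the publisher and Crossref member recorded for it via the public Crossref REST API; the example DOI is a placeholder.

```python
# Illustrative sketch (not JL-Bot's code): look up a Crossref-registered DOI and
# report the publisher/member recorded for it, using the public Crossref REST API.
import requests

def crossref_registrant(doi: str) -> dict:
    """Return publisher and Crossref member information for a DOI."""
    response = requests.get(
        f"https://api.crossref.org/works/{doi}",
        headers={"User-Agent": "anti-disinformation-repo-example/0.1"},
        timeout=30,
    )
    response.raise_for_status()
    message = response.json()["message"]
    return {
        "doi": doi,
        "publisher": message.get("publisher"),
        "crossref_member": message.get("member"),
    }

if __name__ == "__main__":
    # Placeholder DOI; replace with a real DOI of interest.
    print(crossref_registrant("10.1000/xyz123"))
```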
Project: Vanity and Predatory Publishing
Description: This page describes vanity and predatory publishers, with links to lists of both and an explanation of why they might be dangerous for Wikipedia articles. Vanity publishers refer to publishing houses where anyone can pay to have a book published. In contrast, predatory publishers refer to an exploitative academic publishing business model that charges publication fees without checking articles' quality or providing editorial services.[16]
Country: N/A
Region: N/A
Type: Wikipedia Page
Language: English
Group: N/A
Year: 2017
Wikimedia project: Wikipedia
Project: Verificatón en Wikipedia / Verific-a-thon on Wikipedia
Description: This is a training event that promotes a fact-checking culture among the Wikipedian community, the first Wikipedia Verific-a-thon. It is a meeting focused on the verification of knowledge related to health issues and, specifically, the information on COVID-19 found in the free encyclopedia.
Country: Spain
Region: Northern & Western Europe
Type: Event
Language: Spanish
Group: Wikimedia España
Year: 2021
Wikimedia project: Wikipedia
Project: WikiCon Brazil
Description: The theme of the 2022 WikiCon Brazil was disinformation. This theme was developed through trainings, events, and presentations.
Country: Brazil
Region: Latin America & Caribbean
Type: Event
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2022
Wikimedia project: All projects
Project: Wikidata Lab XIV, Modelagem com estruturação intencional / Wikidata Lab XIV, Modeling with intentional structuring
Description: This was a training on how to share resources and capabilities for the integration of Wikidata with other Wikimedia projects, especially Wikipedia.[17]
Country: Brazil
Region: Latin America & Caribbean
Type: Training
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2019
Wikimedia project: Wikidata / Wikipedia
Project: WikiFactCheckers training
Description: This WikiCred project trains media professionals in how to use Wikipedia, Wikidata, and Wikimedia Commons.[18]
Country: Nigeria
Region: Sub-Saharan Africa
Type: Training
Language: English
Group: WikiCred, Code for Africa
Year: 2023
Wikimedia project: Wikipedia / Wikidata / Wikimedia Commons
Project: Wikimedia and Democracy – The impact of Wikimedia UK's information literacy work on citizen engagement
Description: This is a comprehensive analysis, by Wikimedia UK, of its programs and their impact. In particular, it analyzes the impact of Wikimedia UK's information literacy work and how the Wikimedia movement contributes to creating a healthier information environment and a stronger democracy.
Country: United Kingdom
Region: Northern & Western Europe
Type: Research
Language: English
Group: Wikimedia UK
Year: 2021
Wikimedia project: All projects
Project: Wikimedia en Chine / Wikimedia in China
Description: This is a translation of, and commentary on, a memo released by the Wikimedia Foundation regarding an incident in the Chinese community, prepared for the French community.
Country: France
Region: Northern & Western Europe
Type: Research
Language: French
Group: Wikimedia France
Year: 2020
Wikimedia project: Wikipedia
Project: Wikimedia in Education
Description: This is a booklet created by Wikimedia UK in collaboration with the University of Edinburgh. It presents a series of case studies illustrating how to use Wikipedia as a tool for education and how to support learners to understand, navigate, and critically evaluate information, as well as to develop an appreciation for the role and importance of open education.
Country: United Kingdom
Region: Northern & Western Europe
Type: Research / Training
Language: English
Group: Wikimedia UK
Year: 2020
Wikimedia project: Wikidata / Wikipedia
Project: Wikipédia contra a ignorância racional / Wikipedia against rational ignorance
Description: This is a research paper on Wikipedia as a counter to "rational ignorance". João Alexandre Peschanski led this project as an educator before becoming Executive Director of the Wiki Movement Brazil User Group (Grupo de Usuários Wiki Movimento Brasil), and it is framed around a specific behavioral dimension of misinformation in electoral politics. A summary is available in this Wikimedia blog post (with a link to a more substantial research paper), and the project has since evolved into a Wikidata-integrated system for improving content on elections in Brazil.
Country: Brazil
Region: Latin America & Caribbean
Type: Research
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2016
Wikimedia project: Wikipedia
Project: Wikipedia Huggle
Description: Huggle is a diff browser intended for dealing with vandalism and other unconstructive edits on Wikimedia projects. It loads and reviews edits made to Wikipedia in real time, helps users identify unconstructive edits, and allows them to be reverted quickly (an illustrative sketch of such a live edit feed follows this entry).
Country: N/A
Region: N/A
Type: Tool
Language: Spanish, French, Portuguese, English, Chinese, Dutch, Bulgarian, Russian, Serbian, Ukrainian, Azerbaijani, Bosnian, Catalan, Danish, German, Estonian, Croatian, Latvian, Norwegian, Polish, Romanian, Serbo-Croatian, Swedish, Turkish, Czech, Georgian, Urdu, Arabic, Persian, Kurdish, Hebrew, Japanese, Korean, Hindi, Bangla, Odia, Khmer, Vietnamese, Indonesian
Group: N/A
Year: 2008
Wikimedia project: All projects
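Huggle itself is a desktop application; the snippet below is only a minimal illustration of the "review edits in real time" part of the description, polling the standard MediaWiki recent-changes API on English Wikipedia. The selected fields and filters are arbitrary choices for the example, not Huggle's actual configuration.

```python
# Sketch: poll English Wikipedia's recent changes, the kind of live feed that
# anti-vandalism tools such as Huggle build their review queues on.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def recent_changes(limit: int = 10) -> list[dict]:
    """Fetch the latest non-bot changes with title, user, comment, and size info."""
    params = urllib.parse.urlencode({
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|comment|sizes|timestamp",
        "rcshow": "!bot",
        "rclimit": str(limit),
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{params}", timeout=10) as resp:
        return json.load(resp)["query"]["recentchanges"]

for change in recent_changes():
    delta = change.get("newlen", 0) - change.get("oldlen", 0)
    print(f"{change['timestamp']} {change['title']!r} by {change['user']} ({delta:+d} bytes)")
```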
Project: Wikipedia Knowledge Integrity Risk Observatory, Wikimedia Research
Description: This is a monitoring system for exploring data and metrics on knowledge integrity risks and comparing Wikipedia language editions. Knowledge integrity risks are captured through the volume of high-risk revisions, identified with the language-agnostic revert-risk machine learning (ML) model, and through their revert ratios over time and across pages (a toy calculation of the revert-ratio metric follows this entry).
Country: N/A
Region: Global
Type: Research
Language: English
Group: Wikimedia Foundation
Year: 2021–present
Wikimedia project: N/A
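The Observatory's real pipeline and data sources are not reproduced here; the sketch below only illustrates, on invented records, the revert-ratio metric mentioned in the description: for each wiki, the share of (high-risk) revisions that ended up reverted.

```python
# Toy illustration of the revert-ratio metric described above.
# The revision records are invented; the real system works on Wikimedia data.
from collections import defaultdict

# Each record: (wiki, predicted_high_risk, was_reverted)
revisions = [
    ("en.wikipedia", True, True),
    ("en.wikipedia", True, False),
    ("en.wikipedia", False, False),
    ("es.wikipedia", True, True),
    ("es.wikipedia", False, True),
]

def revert_ratios(records, high_risk_only=True):
    """Share of revisions that were reverted, per wiki."""
    totals, reverted = defaultdict(int), defaultdict(int)
    for wiki, high_risk, was_reverted in records:
        if high_risk_only and not high_risk:
            continue
        totals[wiki] += 1
        reverted[wiki] += int(was_reverted)
    return {wiki: reverted[wiki] / totals[wiki] for wiki in totals}

print(revert_ratios(revisions))          # ratio among high-risk revisions only
print(revert_ratios(revisions, False))   # ratio among all revisions
```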
Project: Wikipedia SWASTHA
Description: Wikipedia SWASTHA is a collaborative platform uniting multilingual editors from various Wikipedia communities to exchange best practices and jointly edit healthcare pages. Collaborating with government agencies, the United Nations, and the World Health Organization, SWASTHA focuses on enhancing healthcare awareness in local communities by providing accessible, multilingual health information on Wikipedia. Recognized by prominent media outlets for its work, SWASTHA highlights the ethical dissemination of medical information in local languages to empower individuals with limited access to healthcare resources. Organizations are invited to join the mission and harness the power of Wikipedia to create lasting, positive change in global healthcare.
Country: N/A
Region: East, Southeast Asia, & Pacific
Type: Wikipedia Project
Language: English, Urdu, Hindi, Bengali, Marathi, Telugu, Tamil, Gujarati, Kannada, Odia, Malayalam, Punjabi, Bhojpuri, Maithili, and Nepali
Group: N/A
Year: 2020
Wikimedia project: Wikipedia
Project: Wikipedia:Deprecated sources
Description: This is a list of sources that are "deprecated," whose use on Wikipedia is discouraged because they "fail the reliable sources guidelines in nearly all circumstances."[19]
Country: N/A
Region: N/A
Type: Wikipedia Page
Language: English, Russian, Polish
Group: N/A
Year: 2018
Wikimedia project: Wikipedia
Project: Wikipédia:Fontes confiáveis/Central de confiabilidade / Wikipedia:Reliable sources/Trust Center
Description: This page was created to centralize discussions about the sources used in articles. It lists sources that editors should avoid using in the project, many of which have been associated with disinformation campaigns. It is similar to the lists of unreliable sources that exist for other language Wikipedias.[20]
Country: Brazil
Region: Latin America & Caribbean
Type: Wikipedia Page
Language: Brazilian Portuguese
Group: Wiki Movimento Brasil
Year: 2022
Wikimedia project: Wikipedia
Project: Wikipedia:Nierzetelne źródła / Wikipedia:Unreliable sources
Description: This is an editorial recommendation including a list of unreliable sources that should not be used by Polish Wikipedia editors, with an explanation of how these are selected and why.
Country: Poland
Region: Central & Eastern Europe / Central Asia
Type: Wikipedia Page
Language: Polish
Group: N/A
Year: 2022
Wikimedia project: Wikipedia
Project: Wikipedia:Reliable sources/Perennial sources
Description: This is a non-exhaustive list of sources whose reliability and use on Wikipedia are frequently discussed. This list summarizes prior consensus and consolidates links to the most in-depth and recent discussions from the reliable sources noticeboard and elsewhere on Wikipedia.[21]
Country: N/A
Region: N/A
Type: Wikipedia Page
Language: English
Group: N/A
Year: 2018
Wikimedia project: Wikipedia
Project: Wikipedia:Trovärdiga källor / Wikipedia:Credible sources
Description: This is an essay explaining how to find and use credible sources, both in general and specifically in Swedish. Many Swedish-speaking Wikipedians have been involved in creating and maintaining this resource.
Country: N/A
Region: Northern & Western Europe
Type: Wikipedia Page
Language: Swedish
Group: N/A
Year: 2008
Wikimedia project: Wikipedia
Project: Wikipedia:Vaccine safety/Reliable sources
Description: Launched in April 2022 as a NewsQ project, Wikipedia “Reliable Sources” seeks to support Wikipedians who write articles on vaccines and want to cite reliable sources of vaccine information. NewsQ is collaborating with experienced Wikipedia editors to gather data on source quality and refine a list of reputable sources of vaccine information throughout 2023. NewsQ is a Hacks/Hackers initiative that has sought to elevate quality journalism when algorithms rank and recommend news online.
Country: N/A
Region: N/A
Type: Wikipedia Page
Language: English
Group: Affiliates include the News Quality Initiative (NewsQ), Hacks/Hackers, and Knowledge Futures Group, with support from Tow-Knight Center for Entrepreneurial Journalism at the CUNY Newmark Graduate School of Journalism, and Craig Newmark Philanthropies.
Year: 2022
Wikimedia project: Wikipedia
Project: Wikipedia:WikiProject Climate change
Description: WikiProject Climate change is a collaborative effort to improve Wikipedia articles related to climate change. The WikiProject covers topics related to the causes of climatic change, the effects of climate change, and how society responds in terms of adaptation, mitigation and social and political change.[22]
Country: N/A
Region: Global
Type: Wikipedia Page
Language: English, German, Spanish, French, Polish, Swedish, Vietnamese, Czech
Group: N/A
Year: 2010
Wikimedia project: Wikipedia
Project: Wikipedia:WikiProject COVID-19/SureWeCan COVID19 Task Force
Description: This task force project focuses on Wikipedia editing and writing to share simple, factual information on the coronavirus pandemic in New York City with the goal of explaining the seriousness of the pandemic.[23]
Country: USA
Region: North America
Type: Wikipedia Project / Edit-a-thon
Language: English, Spanish, Chinese, Russian, Italian, Haitian Creole, Korean, French, Tagalog, Polish, Yiddish, Japanese, Hindi, Hebrew, Swahili, Malagasy, Yoruba, German, Portuguese, Arabic, Hungarian, Bengali, Greek
Group: Sure We Can
Year: 2021
Wikimedia project: Wikipedia
Project: Wikipedia:WikiProject COVID-19
Description: This is the WikiProject related to Wikipedia's coverage of COVID-19. It consolidates and coordinates community efforts to create trustworthy information on the pandemic. It was developed in part through a partnership with the World Health Organization (WHO).
Country: N/A
Region: Global
Type: Wikipedia Project
Language: English, Urdu, Chinese, Korean, Russian, Spanish, Italian, Czech, Arabic, Sindhi, Greek, Malay, Croatian, Dutch, Slovak, Finnish, Turkish, Marathi, Bangla, Punjabi, Telugu, Malayalam, Thai
Group: N/A
Year: 2020
Wikimedia project: Wikipedia
Project: Wikipedia:WikiProject_Reliability
Description: This is a WikiProject dedicated to improving the reliability of Wikipedia articles. Its work includes curating reliable sources, answering inquiries on the reliable sources noticeboard, and maintaining a list of frequently discussed sources.[24]
Country: N/A
Region: N/A
Type: Wikipedia Project
Language: English
Group: N/A
Year: 2011
Wikimedia project: Wikipedia
Project: Wikipedian in Residence at Bantayog ng mga Bayani
Description: The Bantayog ng mga Bayani is a foundation that maintains a memorial, museum, and library dedicated to remembering and honoring the heroes, martyrs, and victims of the Ferdinand Marcos dictatorship. WikiSocPH partnered with the Bantayog, which hired a Wikimedian in Residence (WiR) to help digitize the Bantayog's library, improve Wikipedia's coverage of articles related to the Ferdinand Marcos regime, and organize Wikimedia events and workshops for the Bantayog's visitors, researchers, and supporters.
Country: Philippines
Region: East, Southeast Asia, & Pacific
Type: Fellowship
Language: English
Group: WikiSocPH
Year: 2018
Wikimedia project: Wikipedia
Project: Віківишколи/Тренінг для журналістів 19 травня 2019 / Wiki trainings/Training for journalists, 19 May 2019
Description: This is a master class, "How to bring journalism into Wikipedia", that was held in Kyiv in 2019 and organized by Wikimedia Ukraine together with the Institute for the Development of the Regional Press. The event was part of the Investigative Journalism Month held on Ukrainian Wikipedia in June and July.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Training
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2019
Wikimedia project: Wikipedia
Project: Вікіпедія:Авторитетні джерела / Wikipedia:Reliable sources
Description: This is a blog post explaining a filter and list of untrustworthy sources that flags such sources when editors attempt to use them.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Blog post
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2020
Wikimedia project: Wikipedia
Project: ГО «Вікімедіа Україна» попереджає: користуючись Вікіпедією, не платіть шахраям! / NGO "Wikimedia Ukraine" warns: When using Wikipedia, do not pay fraudsters!
Description: This is a blog post explaining how Wikipedia works and warning users against fraud and paid editing.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Blog post
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2020
Wikimedia project: Wikipedia
Project: Уикипедия:Патрульори/СФИН / Wikipedia:Patrollers/SFIN
Description: This is a filter, maintained by the community of editors, that prevents users who are not members of the "autopatrolled" group from using unreliable sources in the main namespace and in the incubator. When such a user tries to add a link to a source on the list, the abuse filter, which pulls data from the list, detects the link and rejects the edit (the decision logic is sketched after this entry).
Country: Bulgaria
Region: Central & Eastern Europe / Central Asia
Type: Tool
Language: Bulgarian
Group: N/A
Year: 2016 (for the abuse filter); 2019 (adopted and enforced as a policy)
Wikimedia project: Wikipedia
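Real AbuseFilter rules are written in MediaWiki's own filter syntax, and the actual SFIN list lives on Bulgarian Wikipedia; the Python sketch below only mimics the decision logic described above (a blacklist lookup with an exemption for autopatrolled users). The domains and group names are invented placeholders.

```python
# Conceptual sketch of the decision logic described above, not the actual
# Bulgarian Wikipedia abuse filter or its source list.
import re

UNRELIABLE_DOMAINS = {"example-fake-news.bg", "example-tabloid.com"}  # hypothetical

def edit_allowed(added_text: str, user_groups: set[str]) -> bool:
    """Reject edits by non-autopatrolled users that add blacklisted links."""
    if "autopatrolled" in user_groups:
        return True
    added_domains = re.findall(r"https?://(?:www\.)?([^/\s]+)", added_text)
    return not any(domain.lower() in UNRELIABLE_DOMAINS for domain in added_domains)

print(edit_allowed("See https://example-fake-news.bg/story", {"user"}))           # False
print(edit_allowed("See https://example-fake-news.bg/story", {"autopatrolled"}))  # True
print(edit_allowed("See https://bg.wikipedia.org/wiki/Example", {"user"}))        # True
```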
Project: Як перевіряти інформацію: поради з фактчекінгу від Vox Check / How to check information: Fact-checking tips from Vox Check
Description: This is the abstract of a training on disinformation held by Wikimedia Ukraine in collaboration with Vox Check in 2021.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Training
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2021
Wikimedia project: Wikipedia
Project: Як читати новини критично? and Експерти та експертність у медіа / How to read news critically? and Experts and expertise in the media
Description: This is the abstract of two trainings on disinformation and media literacy that were held by Wikimedia Ukraine in collaboration with Media Detector in 2020.
Country: Ukraine
Region: Central & Eastern Europe / Central Asia
Type: Blog post
Language: Ukrainian
Group: Wikimedia Ukraine
Year: 2020
Wikimedia project: Wikipedia
BIBLIOGRAPHICAL REFERENCES
- ↑ a b "The Wikipedia Library/1Lib1Ref – Meta". meta.wikimedia.org. Retrieved 2023-06-08.
- ↑ a b "Citation Hunt – Meta". meta.wikimedia.org. Retrieved 2023-06-08.
- ↑ a b User:SuperHamster/CiteUnseen, 2023-01-03, retrieved 2023-06-08.
- ↑ a b Wikipedia:WikiProject Academic Journals/Journals cited by Wikipedia/Questionable1, 2023-06-05, retrieved 2023-06-08.
- ↑ a b "Flagged Revisions – Meta". meta.wikimedia.org. Retrieved 2023-06-08.
- ↑ a b Zia, Leila; Johnson, Isaac; M, B.; Morgan, Jonathan; Redi, Miriam; Saez-Trumper, Diego; Taraborelli, Dario (2019-02-14). "Knowledge Integrity – Wikimedia Research 2030". doi:10.6084/m9.figshare.7704626.v2. Retrieved 2023-06-08.
- ↑ a b "User:Ladsgroup/masz – MediaWiki". www.mediawiki.org. Retrieved 2023-06-08.
- ↑ a b "Moderator Tools – MediaWiki". www.mediawiki.org. Retrieved 2023-06-08.
- ↑ a b Wikipedia:New page patrol source guide, 2023-06-05, retrieved 2023-06-08.
- ↑ a b "Open the Knowledge Journalism Awards". Wikimedia Foundation (in en-US). Retrieved 2023-06-08.
- ↑ a b "ORES – MediaWiki". www.mediawiki.org. Retrieved 2023-06-08.
- ↑ a b "User:PSS 9 – Meta". meta.wikimedia.org. Retrieved 2023-06-08.
- ↑ a b "Education/Reading Wikipedia in the Classroom – Meta". meta.wikimedia.org. Retrieved 2023-06-08.
- ↑ a b Template:Cite Q, 2021-08-21, retrieved 2023-06-08.
- ↑ a b User:Headbomb/unreliable, 2023-06-02, retrieved 2023-06-08.
- ↑ a b Wikipedia:Vanity and predatory publishing, 2022-11-17, retrieved 2023-06-08.
- ↑ a b Wikipédia:Edit-a-thon/Atividades em português/Wikidata Lab XIV (in Portuguese), 2020-12-11, retrieved 2023-06-08.
- ↑ a b MisinfoCon Guest Contributor (2023-04-24). "Training media professionals to be “WikiFactCheckers”". Medium. Retrieved 2023-06-08.
- ↑ a b Wikipedia:Deprecated sources, 2023-05-20, retrieved 2023-06-08.
- ↑ a b Wikipédia:Fontes confiáveis/Central de confiabilidade (in Portuguese), 2023-06-05, retrieved 2023-06-08.
- ↑ a b Wikipedia:Reliable sources/Perennial sources, 2023-06-07, retrieved 2023-06-08.
22. Wikipedia:WikiProject Climate change, retrieved 2023-06-08.
23. Wikipedia:WikiProject COVID-19/SureWeCan COVID19 Task Force, 2022-09-12, retrieved 2023-06-08.
24. Wikipedia:WikiProject Reliability, 2023-06-07, retrieved 2023-06-08.