Research:Developing Wikimedia Impact Metrics as a Sociotechnical Solution for Encouraging Funder and Academic Engagement

This page documents a research project in progress.
Information may be incomplete and change as the project progresses.
Please contact the project lead before formally citing or reusing results from this page.



Access the full Stage II proposal on Open Review:

The main problems we seek to solve are the low levels of academic engagement in Wikimedia, and the (lack of) professional recognition that academics and volunteers receive for engaging (Jemielniak & Aibar, 2016; Konieczny, 2016). Our project addresses these by researching, developing, and making available so-called Wikimedia Impact Metrics, which demonstrate the impact of contributions. In conjunction with this, we aim to build a consortium of researchers in the academic community to push funders and universities to value these contributions and apply for larger grants.

The basic argument is that scientists want to make their knowledge open, and Wikimedia is where the public looks for information, yet few scientists engage with Wikimedia. Research suggests this is in part due to a lack of incentives to engage (Chen et al., 2023; Kincaid et al., 2021; Taraborelli et al., 2011); we therefore develop Wikimedia Impact Metrics and encourage funders to value them as grant outcomes. By the end of the grant, we expect to have built a community through the survey, developed some basic metrics, and produced a demonstrator that the community can provide feedback on.

About the research team


Brett Buttliere


is an Assistant Professor at the Centre for European, Regional, and Local Studies, University of Warsaw. He has done research on the history of science using Wikimedia, and on how Wikimedia can help make science open. He works on digital infrastructure for science in general, especially by keeping in mind the psychology of scientists, since they are the ones actually doing the science and making the decisions.

Matthew Vetter


is a Professor of English at Indiana University of Pennsylvania. He co-chairs the CCCC Wikipedia Initiative, a disciplinary project that encourages more English Studies academics to contribute to Wikimedia via WikiProject Writing. He is the author, with Zach McDowell, of Wikipedia and the Representation of Reality (Routledge, 2021). He is a veteran instructor with Wiki Education, as well as a researcher on Wikipedia-based education. He has also served on the Wikimedia Foundationʼs North America regional grants committee for the past three years.

Sage Ross


is Chief Technology Officer of Wiki Education and the main developer behind the P&E Dashboard. He has developed many metrics related to Wikimedia and its utilization, and the P&E Dashboard has been used across hundreds of initiatives, programs, and systems Wikimedia-wide. He has an extensive background in developing websites and successful Wikimedia tools.



There are four specific goals of the project:

  1. Survey and interview Wikimedians to understand what metrics are good for indicating impact.
  2. Develop some basic metrics of impact, including contributions, pageviews, clickthroughs, and others.
  3. Develop a minimally viable product (MVP) that presents some basic statistics such as edits and views.
  4. Develop and apply for larger grants, including conference and EU grants (e.g., for science communication).

Our goal is to understand how Wikimedia can strengthen its position in the knowledge ecosystem through the engagement of academics and other experts, especially through the development of metrics of impact. This goal is particularly relevant because it provides professional credit to those already in the movement as well as encouraging others to join (Strategy 2030 1.1: Support Volunteers).

We approach this topic from three directions, each corresponding to one co-PI and each estimated at approximately 3–4 months of work.

Methods for the survey


The idea is to survey Wikimedians, and organizers of events at academic associations (as represented by the organizer), on which metrics they think indicate impact. The sample for the survey will come from two sources. First, our project team is currently engaged in a Rapid Grant in which we examine and collect data on how various professional organizations are engaging with Wikimedia. This project involves surveying 200 academic associationsʼ websites for Wikimedia-related activities and identifying the people who have engaged with or helped develop Wikimedia-related activities within those organizations (so-called organizational champions). Aside from these organizational champions, an identical but separate survey will be spread throughout our networks, especially the Wikimedia Research Group, Wikimedia Education, the Edu Wiki User Group, the CCCC Wikipedia Initiative / WikiProject Writing, OpenCon, and the SPARC Open Forum. Additionally, Chris Shilling has offered to let us email those they have funded in the past, to ask how they would measure their impact. Finally, if needed, we intend to leverage the network of Jamie Mathewson at Wiki Education to identify champions who might have an interest in the topic to take the survey. The current draft of the survey is available here, and we welcome feedback on its further development, including any particular questions to be asked or themes to be investigated.

Methods for metric development


The goal is to develop bibliometric indicators for researchers. Establishing effective bibliometrics involves a series of standard steps and set points, which are used to understand the new tools and compare them to other metrics. Aside from these individual metrics, there are necessary considerations at the aggregate level. The current P&E Dashboard is for looking across users, and it will likely be of use to funders and decision makers to be able to see, e.g., across several years of their funding efforts, who is having impact and in what areas. Currently, our team is considering metrics of contributions and metrics of impact, as laid out in Table 1. Many of the metrics are ʻstandardʼ in the sense that they are already made available through the P&E Dashboard.

Table 1: Proposed impact metrics

Contribution       | Impact
Words added        | Views
Articles added     | Views
References         | Click-throughs
Wikilinks          | Click-throughs
External links     | Click-throughs
DOIs added         | Click-throughs
Wikimedia uploads  | Downloads

These two types of metrics map fairly well onto existing metrics both in science and in, e.g., online business (views and click-throughs). In science, the analogues are the creation of data or papers, and then citations.
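As a concrete illustration of the view-based metrics, daily page views for an article can be pulled from the public Wikimedia Pageviews REST API and summed into a simple impact figure. The sketch below is our own illustration, not part of the P&E Dashboard; the helper names are assumptions.

```python
from urllib.parse import quote

# Public Wikimedia Pageviews REST API (per-article endpoint).
PAGEVIEWS_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(project, article, start, end,
                  access="all-access", agent="user", granularity="daily"):
    """Build the per-article pageviews URL; start/end are YYYYMMDD strings."""
    # Article titles use underscores and must be URL-encoded.
    title = quote(article.replace(" ", "_"), safe="")
    return (f"{PAGEVIEWS_BASE}/{project}/{access}/{agent}/"
            f"{title}/{granularity}/{start}/{end}")

def total_views(api_response):
    """Sum the daily counts from the API's {"items": [{"views": N, ...}]} payload."""
    return sum(item["views"] for item in api_response.get("items", []))
```

To use it, fetch the URL with any HTTP client (Wikimedia's API policy asks for a descriptive User-Agent header) and pass the decoded JSON to `total_views`.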

A more important aspect of this work will be determining which edits are relevant to the grant and which are not, especially when users have multiple ongoing grants for which not all edits are relevant. Currently, the P&E Dashboard allows users to choose target articles and categories to include in metrics, but we suspect that we will also need to be able to select edits that fall, e.g., within the grant dates, and some mechanisms will need to be considered to prevent double dipping. Finally, it will be important to think about how the metrics can and will be “gamed” (Oravec, 2019) or misused, and to work toward preventing this.
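A minimal sketch of that edit-selection logic, under stated assumptions: each edit is a record with a revision ID and a timestamp (these field names are our illustration, not the P&E Dashboard's actual schema). A grant keeps only the edits inside its date window, and revision IDs already credited to another grant are skipped to prevent double dipping.

```python
from datetime import date

def edits_for_grant(edits, grant_start, grant_end, already_credited=frozenset()):
    """Select edits inside the grant window that have not yet been credited
    to another grant. `edits` is an iterable of dicts with "rev_id" and
    "timestamp" (a date) keys -- illustrative field names, not a real schema."""
    return [
        e for e in edits
        if grant_start <= e["timestamp"] <= grant_end
        and e["rev_id"] not in already_credited
    ]

# Example: two overlapping grants drawing on one editor's history.
edits = [
    {"rev_id": 1, "timestamp": date(2024, 2, 1)},
    {"rev_id": 2, "timestamp": date(2024, 6, 1)},
    {"rev_id": 3, "timestamp": date(2024, 9, 1)},
]
grant_a = edits_for_grant(edits, date(2024, 1, 1), date(2024, 6, 30))
credited = {e["rev_id"] for e in grant_a}
# Grant B's window overlaps Grant A's, but edit 2 is already credited.
grant_b = edits_for_grant(edits, date(2024, 5, 1), date(2024, 12, 31),
                          already_credited=credited)
```

A real implementation would also need a policy for which grant gets priority over a shared edit; here the first grant evaluated simply claims it.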

Methods for tool development


The main goal of WP3 is the development of a Minimally Viable Product (MVP) that can be presented to the community for feedback. This project will be based on the P&E Dashboard. The P&E Dashboard is mostly used ʻinternallyʼ, within Wikimedia or within classrooms, particular programs, or events, to track contributions and impact. The goal here will be to rework it in small ways so that it is more accessible and useful for individual researchers, universities, and funders, as opposed to Wikimedia program directors looking to track developments.

Each campaign can be a grant class. This reworking will involve renaming various portions and, most importantly, enabling authors to easily package particular edits as ʻproject relevantʼ in a way that funders or universities can then examine and isolate as parts of, e.g., tenure packets or grant outcomes. This is the major research and development part of the project. The idea is to stick as closely as possible to the P&E Dashboard, mostly focusing on making it more administration- and evaluation-friendly. Most of the metrics that are used on the P&E Dashboard will remain, though we look to develop them with the results of the survey. We do expect to develop a new website to accommodate this tool; such sites are available for approximately $50 per year. Developing this functionality will serve two purposes, since the technology will also be implemented into the P&E Dashboard, such that editors will have better control over which of their edits are program- and event-relevant.

Presenting, team building, and grant writing


A significant portion of the work is presenting the project, team building, networking, and identifying people with whom to further develop the tool. It does not really make sense to build this tool unless we market and advertise it so that it can be effectively used. We expect, by the end of the year, to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland. These opportunities represent excellent venues to gather opinions both within the Wikimedia community and the research evaluation community. Building this team is more than side or part-time work, and we changed the project from Stage 1 to make this work more explicit, since it is not an insignificant amount of work to build a team or apply for large-scale grants. These work packages combine quantitative, investigative, and qualitative research to solidify Wikimediaʼs position as an Open Knowledge platform, positioning Wikipedia as an interface between science and the public.

Expected output


This grant will support the development of several expected outputs, including at least one scientific paper outlining the argumentation and survey results, the development of the website, several conference papers, and the development and submission of at least one grant to support continued development.

Publish an academic paper


Our goal is to publish a main paper and several conference papers outlining the work and the idea. The goal of the main paper will be to have widespread impact, serving as an introduction to the tool, with examples, that makes the argument in a major journal. We will target high-impact journals such as Science or Nature, but also concede that the paper may be published in Scientometrics, Quantitative Science Studies, or other venues where bibliometrics are valued.

Minimally viable website and tool


Creating the metrics and functionality is not useful if they are not made available to the public. Thus, we expect to develop a minimally viable product (MVP), which is not intended to be a final product but rather the most basic functions possible, on which we can receive feedback.

Conference presentations and panel


As outlined in the Methods section, we expect at the end of the year to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland, or another relevant conference.

Community impact plan




Policy, Ethics and Human Subjects Research











References

Arroyo-Machado, W., Díaz-Faes, A. A., Herrera-Viedma, E., & Costas, R. (2023). From academic to media capital: To what extent does the scientific reputation of universities translate into Wikipedia attention? Journal of the Association for Information Science and Technology.

Besançon, L., Peiffer-Smadja, N., Segalas, C., Jiang, H., Masuzzo, P., Smout, C., Billy, E., Deforet, M., & Leyrat, C. (2021). Open science saves lives: Lessons from the COVID-19 pandemic. BMC Medical Research Methodology, 21(1), 117.

Buttliere, B., & Wicherts, J. (2013). What next for scientific communication? A large scale survey of psychologists on problems and potential solutions. Masterʼs Thesis.

Buttliere, B. T. (2014). Using science and psychology to improve the dissemination and evaluation of scientific work. Frontiers in Computational Neuroscience, 8, 82.

Buttliere, B., & Wicherts, J. (2014). Opinions on the value of direct replication: A survey of 2,000 psychologists. PsyArXiv.

Buttliere, B., & Buder, J. (2017). Personalizing papers using Altmetrics: Comparing paper ʻQualityʼ or ʻImpactʼ to person ʻIntelligenceʼ or ʻPersonalityʼ. Scientometrics, 111(1), 219–239.

Buder, J., Zimmermann, A., Buttliere, B., Rabl, L., Vogel, M., & Huff, M. (2023). Online interaction turns the congeniality bias into an uncongeniality bias. Psychological Science, 34(10), 1055-1068.

Chen, Y., Farzan, R., Kraut, R., YeckehZaare, I., & Zhang, A. F. (2023). Motivating experts to contribute to digital public goods: A personalized field experiment on Wikipedia. Management Science, mnsc.2023.4852.

Impact. (2022, August 12). Wiki Education.

Jemielniak, D., & Aibar, E. (2016). Bridging the gap between Wikipedia and academia. Journal of the Association for Information Science and Technology, 67(7), 1773–1776.

Kincaid, D. W., Beck, W. S., Brandt, J. E., Mars Brisbin, M., Farrell, K. J., Hondula, K. L., Larson, E. I., & Shogren, A. J. (2021). Wikipedia can help resolve information inequality in the aquatic sciences. Limnology and Oceanography Letters, 6(1), 18–23.

Konieczny, P. (2016). Teaching with Wikipedia in a 21st-century classroom: Perceptions of Wikipedia and its educational benefits. Journal of the Association for Information Science and Technology, 67(7), 1523–1534.

Kousha, K., & Thelwall, M. (2017). Are Wikipedia citations important evidence of the impact of scholarly articles and books? Journal of the Association for Information Science and Technology, 68(3), 762–779.

Oravec, J. A. (2019). The “Dark Side” of Academics? Emerging Issues in the Gaming and Manipulation of Metrics in Higher Education. The Review of Higher Education, 42(3), 859–877.

Poulter, M., & Sheppard, N. (2020). Wikimedia and universities: Contributing to the global commons in the Age of Disinformation. Insights: the UKSG Journal, 33(1).

Taraborelli, D., Mietchen, D., Alevizou, P., & Gill, A. J. (2011). Expert participation on Wikipedia: Barriers and opportunities [Conference presentation]. Wikimania 2011, Haifa, Israel. mmons/4/4f/Expert_Participation_Survey__Wikimania_2011.pdf

Vetter, M. A., McDowell, Z. J., & Stewart, M. (2019). From opportunities to outcomes: The Wikipedia-based writing assignment. Computers and Composition, 52, 53-64.


Zagorova, O., Ulloa, R., Weller, K., & Flöck, F. (2022). “I updated the &lt;ref&gt;”: The evolution of references in the English Wikipedia and the implications for altmetrics. Quantitative Science Studies, 3(1), 147–173.