Research:Developing Wikimedia Impact Metrics as a Sociotechnical Solution for Encouraging Funder and Academic Engagement
This page documents a research project in progress.
Information may be incomplete and change as the project progresses.
Please contact the project lead before formally citing or reusing results from this page.
Introduction
Access the full Stage II proposal on OpenReview: https://openreview.net/attachment?id=9AhIrXEmDB&name=stage_two_submission
The main problems we seek to solve are the low levels of academic engagement in Wikimedia, and the (lack of) professional recognition that academics and volunteers receive for engaging (Jemielniak & Aibar, 2016; Konieczny, 2016). Our project addresses these by researching, developing, and making available so-called Wikimedia Impact Metrics, which demonstrate the impact of contributions. In conjunction with this, we aim to build a consortium of researchers in the academic community to push funders and universities to value these contributions and apply for larger grants.
The basic argument is that scientists want to make their knowledge open, and Wikimedia is where the public looks for information, yet few scientists engage with Wikimedia. Research suggests this is in part due to a lack of incentives to engage (Chen et al., 2023; Kincaid et al., 2021; Taraborelli et al., 2011), and thus we develop Wikimedia Impact Metrics and encourage funders to value them as grant outcomes. By the end of the grant, we expect to have built a community with the survey, developed some basic metrics, and produced a demonstrator that the community can provide feedback on.
About the research team
Brett Buttliere
Brett Buttliere is an Assistant Professor at the Centre for European, Regional, and Local Studies, University of Warsaw. He has done research on the history of science using Wikimedia and on how Wikimedia can help make science open. He works on digital infrastructure for science in general, especially by keeping in mind the psychology of the scientist, since scientists are the ones actually doing the science and making the decisions.
Matthew Vetter
Matthew Vetter is a Professor of English at Indiana University of Pennsylvania. He co-chairs the CCCC Wikipedia Initiative, a disciplinary project to involve more English Studies academics in contributing to Wikimedia via WikiProject Writing. He is the author, with Zach McDowell, of Wikipedia and the Representation of Reality (Routledge, 2021). He is a veteran instructor with Wiki Education, as well as a researcher on Wikipedia-based education. He has also served on the Wikimedia Foundation's North America regional grants committee for the past three years.
Sage Ross
Sage Ross is Chief Technology Officer of Wiki Education and the main developer behind the P&E Dashboard. He has developed many metrics related to Wikimedia and its utilization, and the P&E Dashboard has been used across hundreds of initiatives, programs, and systems Wikimedia-wide. He has extensive experience developing websites and successful Wikimedia tools.
Methods
There are four specific goals of the project:
- Survey and interview Wikimedians to understand what metrics are good for indicating impact.
- Develop some basic metrics of impact including contributions, pageviews, clickthroughs, and others.
- Develop a minimally viable product (MVP) that presents some basic statistics such as edits and views.
- Develop and apply for larger grants including conference and EU grants for e.g., science communication.
Our goal is to understand how Wikimedia can strengthen its position in the knowledge ecosystem through the engagement of academics and other experts, especially through the development of metrics of impact. This goal is particularly relevant because it provides professional credit to those already in the movement as well as encouraging others to join (Strategy 2030 1.1: Support Volunteers).
We approach this topic from three directions, each corresponding to one co-PI, with each estimated at approximately 3-4 months of work.
Methods for the survey
The idea is to survey Wikimedians, and organizers of events at academic associations (as represented by the organizer), on which metrics they think indicate impact. The sample for the survey will come from two sources. First, our project team is currently engaged in a Rapid Grant in which we examine and collect data on how various professional organizations are engaging with Wikimedia. That project involves surveying 200 academic associations' websites for Wikimedia-related activities and identifying the people who have engaged with or helped develop Wikimedia-related activities within those organizations (so-called organizational champions). Aside from these organizational champions, an identical but separate survey will be spread throughout our networks, especially the Wikimedia Research Group, Wikimedia Education, the Edu Wiki User Group, the CCCC Wikipedia Initiative / WikiProject Writing, OpenCon, and the SPARC Open Forum. Additionally, Chris Shilling has offered to let us email those they have funded in the past, to ask how they would measure their impact. Finally, if needed, we intend to leverage the network of Jamie Mathewson at Wiki Education to identify champions who might have interest in taking the survey. The current draft of the survey is available here, and we welcome feedback on its further development, including any particular questions to be asked or themes to be investigated.
Survey and interview inclusion criteria
- Academic scientists and/or researchers with some previous and/or future interest in engaging with Wikimedia projects (Wikipedia, Wikimedia Commons, Wikisource, Wikidata) as part of their research or professional activity
- English-speaking participants
Methods of subject selection and recruitment
We intend to use the following procedures for subject selection. First, we will share an advertisement about the study (attached to this protocol) across multiple digital channels, including listservs related to Wikimedia projects and social media sites (LinkedIn, the Facebook Wikipedia Weekly group), and by soliciting specific targeted individuals.
Methods and procedures applied to human subjects
Prospective participants will first read the study advertisement via various online channels (social media, specialized listservs). Those who identify that they meet the inclusion criteria will respond to the survey. They will first read the consent form, and those who agree to participate will be asked to complete a survey consisting of a series of qualitative and quantitative questions. The survey will take approximately 10-15 minutes. At the end of the survey, participants will be invited to take part in a subsequent interview; if willing, they will provide their contact email address. Interviews of approximately 30 minutes will be conducted via Zoom with roughly 5-10 participants.
Risks and benefits
There will be minimal risk to all participants when completing the survey and interview. Human subjects will contribute to the development of an impact metric tool for Wikimedia engagement, which may positively benefit their professional activities in the future.
Privacy and confidentiality
The researchers will keep and store the data in a password-protected Qualtrics account, meaning that only the researchers will be able to log on to see the results from the survey. In addition, the data will be saved on a password-locked laptop. Participants who take the survey but do not choose to share their contact information (for interview or other follow-up) will remain completely anonymous; those who do choose to share their contact information will not be anonymous, but their identities will be kept confidential to the researchers. Interview data will be recorded and transcribed, and both recordings and transcripts will be kept on password-protected websites and computers and shared only with PIs Buttliere and Vetter.
Consent
Consent process for survey: The consent form will be placed at the beginning of the Qualtrics survey. Participants will read through the consent form and then indicate whether they want to take the survey. If they select "Yes, I give my consent", they will be taken to the beginning of the survey; if they select "No, I don't give my consent", they will not take the survey.
Consent process for interview: Survey participants will be given the opportunity to express interest in an interview as a question in the survey. PIs will email those survey participants who expressed interest with a separate informed consent document for the interview portion of the study. Once their completed informed consent document is returned, we will schedule the interview with them.
Methods for metric development
The goal is to develop bibliometric indicators for researchers. Establishing effective bibliometrics involves a series of standard steps and set points, which are used to understand the new tools in comparison to other metrics. Beyond these individual metrics, considerations at the aggregate level are also necessary. The current P&E Dashboard is for looking across users, and it will likely be of use to funders and decision makers to be able to see, e.g., across several years of their funding efforts, who is having impact and in what areas. Currently, our team is considering metrics of contributions and metrics of impact, as laid out in Table 1. Many of the metrics are 'standard' in the sense that they are already made available through the P&E Dashboard.
Table 1. Contribution metrics and their corresponding impact metrics.

Contribution | Impact
---|---
Words added | Views
Articles added | Views
References | Click-throughs
Wikilinks | Click-throughs
External links | Click-throughs
DOIs added | Click-throughs
Wikimedia uploads | Downloads
These two types of metrics map fairly well onto existing metrics both in science and in, e.g., online business (views and click-throughs). In science, the analogues are the creation of data or papers, and then citations.
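As an illustration of how an impact metric such as total views could be computed, the sketch below queries the public Wikimedia Pageviews REST API (a real endpoint); the helper function names and the example article are our own illustrative choices, not part of the P&E Dashboard.

```python
import json
import urllib.request

# Public Wikimedia Pageviews REST API (per-article endpoint).
PAGEVIEWS_API = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(project: str, article: str, start: str, end: str) -> str:
    """Build a per-article pageviews query: daily granularity, all access
    methods, all agents. Dates are YYYYMMDD strings."""
    return (f"{PAGEVIEWS_API}/{project}/all-access/all-agents/"
            f"{article}/daily/{start}/{end}")

def total_views(response: dict) -> int:
    """Sum the 'views' field across the items in an API response."""
    return sum(item["views"] for item in response.get("items", []))

# Example (requires network access):
# url = pageviews_url("en.wikipedia", "Open_science", "20240101", "20240131")
# with urllib.request.urlopen(url) as resp:
#     print(total_views(json.load(resp)))
```

Note that the Pageviews API reports views only; a click-through metric would require a different data source.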
A more important aspect of this work will be determining which edits are relevant to a grant and which are not, especially when users have multiple ongoing grants for which not all edits are relevant. Currently, the P&E Dashboard allows users to choose target articles and categories to include in metrics, but we suspect that we will also need to be able to restrict edits to, e.g., those within the grant dates, and some mechanisms will need to be considered to prevent double dipping. Finally, it will be important to think about how the metrics can and will be "gamed" (Oravec, 2019) or misused, and to work toward preventing this.
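The date-window filtering and double-dipping prevention described above could look roughly like the following sketch. The edit dictionaries mimic the shape returned by the MediaWiki API's `list=usercontribs` query (`revid`, `timestamp` fields); the function names and overall design are our own assumptions, not the Dashboard's actual implementation.

```python
from datetime import datetime, timezone

def parse_ts(ts: str) -> datetime:
    # MediaWiki timestamps look like "2024-03-05T12:30:00Z".
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def grant_relevant(edits, start, end, already_counted=None):
    """Keep only edits inside the grant window [start, end], counting each
    revision at most once to prevent double dipping across grants."""
    counted = set() if already_counted is None else already_counted
    relevant = []
    for edit in edits:
        if start <= parse_ts(edit["timestamp"]) <= end and edit["revid"] not in counted:
            counted.add(edit["revid"])
            relevant.append(edit)
    return relevant
```

Passing the same `already_counted` set when computing metrics for a second grant would ensure that an edit credited to one grant is not credited again to another.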
Methods for tool development
The main goal of WP3 is the development of a Minimally Viable Product (MVP) that can be presented to the community for feedback. This project will be based on the P&E Dashboard. The P&E Dashboard is mostly used 'internally', within Wikimedia or within classrooms, particular programs, or events, to track contributions and impact. The goal here will be to rework it in small ways so that it is more accessible and useful for individual researchers, universities, and funders, as opposed to Wikimedia program directors looking to track developments.
Each campaign can be a grant class. This reworking will involve renaming various portions and, most importantly, enabling authors to easily package particular edits as 'project relevant' in a way that funders or universities can then examine and isolate as parts of, e.g., tenure packets or grant outcomes. This is the major research and development part of the project. The idea is to stick as closely as possible to the P&E Dashboard, focusing mostly on making it more administration- and evaluation-friendly. Most of the metrics used on the P&E Dashboard will remain, though we look to develop them with the results of the survey. We do expect to develop a new website to accommodate this tool; Wikimediaimpact.com and Wikimediaimpact.org are available for approximately $50 per year. Developing this functionality will serve two purposes, since the technology will also be implemented in the P&E Dashboard, such that editors will have better control over which of their edits are program- and event-relevant.
Presenting, team building, and grant writing
A significant portion of the work is presenting the project, team building, networking, and identifying people to further develop the tool with. It does not really make sense to build this tool unless we market and advertise it so that it can be effectively used. We expect by the end of the year to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland. These represent excellent venues to gather opinions from both the Wikimedia community and the research evaluation community. Building this team is more than side or part-time work, and we changed the project from Stage 1 to make this work more explicit, since it is not an insignificant amount of work to build a team or apply for large-scale grants. These work packages combine quantitative, investigative, and qualitative research to solidify Wikimedia's position as an Open Knowledge platform, positioning Wikipedia as an interface between science and the public.
Expected output
This grant will support the development of several expected outputs, including at least one scientific paper outlining the argumentation and survey results, the development of the WikimediaImpact.org website, several conference papers, and the development and submission of at least one grant to support continued development.
Publish an academic paper
Our goal is to publish a main paper and several conference papers outlining the work and idea. The goal of the main paper will be widespread impact: an introduction to the tool, with examples, making the argument in a major journal. We will target high-impact journals such as Science or Nature, but concede that the paper may be published in Scientometrics, Quantitative Science Studies, or other venues where bibliometrics are valued.
Minimally viable website and tool
Creating the metrics and functionality is not useful if they are not made available to the public. Thus, we expect to develop a minimally viable product (MVP), which is not intended to be a final product but rather the most basic functions possible, on which we can receive feedback.
Conference presentations and panel
As outlined in the Methods section, we expect by the end of the year to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland, or another relevant conference.
Press/Media
- "Vetter Awarded Research Grant from Wikimedia Foundation." English Department News. Indiana University of Pennsylvania.
Timeline
TBD
Policy, Ethics and Human Subjects Research
THIS PROJECT HAS BEEN APPROVED BY THE INDIANA UNIVERSITY OF PENNSYLVANIA INSTITUTIONAL REVIEW BOARD FOR THE PROTECTION OF HUMAN SUBJECTS (PHONE 724.357.7730).
Results
Survey results overview
We launched the survey in July 2024 so that we could advertise and recruit people while at Wikimania 2024. At the same time, we took advantage of having everyone together in Europe to advertise our COST Action. In the end, we had more than 100 respondents to the survey, mostly recruited from the 4Cs initiative and the WikiResearch email lists, as well as diverse participants from handouts and personal meetings at the Wikimania and WikiCon North America meetings.
The participants in our survey include researchers, academics, librarians, scientific publishers, private employees, Wikimedians in Residence, and volunteers more generally. The goal is to help every professional Wikimedian get credit for the work they are doing.
The survey reveals that a significant number of respondents participate in the Wikimedia projects and other initiatives in their professional work, with 49% doing so extensively. Many of the participants indicated using Wikimedia within the context of their educational work, teaching students using Wikimedia assignments for instance.
However, a notable challenge is the lack of formal rewards or recognition for these contributions within their professional contexts, as 30% reported not being rewarded at all.
Several factors contribute to Wikimedia engagement not being valued as much as traditional academic activities, including perceptions of Wikipedia's reliability (65%) and a lack of formal recognition or incentives (53%). Despite these challenges, researchers do communicate their Wikimedia impact through various channels, with informal conversations being the most common (70%). Beyond Wikimedia, many respondents are also active in other open initiatives such as open access publishing and open data.

While there are mixed views on how actively academic institutions are engaging with or recognizing the value of Wikimedia, the primary reasons for resistance to engaging with Wikimedia among scientists and academics include time constraints (75%) and concerns about information reliability (70%), a finding that complicates previous research regarding growing acceptance of Wikipedia among academics (Malik et al., 2023). A particularly significant finding involved academics' ranking of metrics for measuring Wikimedia contributions; respondents highlighted the importance of new editors recruited or trained, references to work in their field, new articles, total views, word count, and edit count (see the table below).
Metric item | Mean rating (Likert) |
---|---|
New editors recruited/retained | 3.68 |
Refs to work in [respondent's] field | 3.64 |
New articles added | 3.64 |
Total views | 3.34 |
Total words contributed | 3.15 |
Edit count | 3.09 |
Focus group results overview
In the focus group interviews, themes revealed several potential avenues for enhancing academic engagement with Wikimedia, focusing on tool designs and incentives. Participants explored the idea of an Individual Researcher Profile, akin to ORCID, to showcase diverse Wikimedia contributions. They also suggested a simplified version of the Programs and Events Dashboard for individual editors. A customizable dashboard with various metrics and export functions was proposed to allow researchers to curate and present their contributions effectively. Furthermore, project-based reporting was discussed as a way to highlight the impact of specific research projects. The creation of a metrics visualizer was suggested to make impact data more accessible, as well as an "Altmetric" integration to track the broader impact of research on Wikipedia.

Regarding incentives, analyzing the "Thank" feature and implementing a review system were considered. Academics value metrics such as new articles added, total views, and references to work in their field. However, concerns were raised about the possibility of "gaming" metrics, the visibility of Wikimedia work, the relevance of edits, and the burden of creating yet another system for academics to engage with and update. The focus group sessions, overall, emphasized the importance of motivating academics by demonstrating impact and educating them on the meaning of different metrics.
New Version of the Programs and Events Dashboard
Perhaps the major outcome of the grant will be the new and improved version of the Programs and Events Dashboard. This 'Individual Instance' is more focused on the impacts and contributions of individuals, rather than of groups of people during events or classes.
Research People List
Another outcome of this grant was the development of a list of researchers who are associated with Wikimedia. This was done in partnership with our collaborators Iolanda Pensa and Kinneret Gordon. The list now has over 140 people on it, along with where they are from and what types of work they are doing.
This list also formed the basis of the WikiScience COST Action and the WikiScience Hub, which are in progress.
WikiScience Hub
The final major portion of the work has involved the development and initial planning of a WikiScience Hub, which is beginning this year along with representatives from each of the regions that Wikimedia funds.
Discussion
The results of this study, encompassing both focus group discussions and survey responses, carry significant implications for enhancing academic and professional engagement with Wikimedia projects. One key implication is the clear need for improved mechanisms to showcase and measure academic contributions to Wikimedia in ways that are valued within academic and professional circles. The focus group transcripts reveal a desire for tools like an Individual Researcher Profile and a customizable dashboard that go beyond simple edit counts to highlight meaningful contributions such as new articles, references added, and total views. The survey results corroborate this, indicating that academics value metrics like references to their work, new articles added, and new editors recruited or trained. This suggests that a shift towards more nuanced and academically relevant metrics is necessary to incentivize engagement.
Another major implication is the persistence of skepticism and a lack of understanding about Wikimedia within academia. The survey highlights that significant reasons for resistance include concerns about information reliability (70%) and limited academic recognition or incentives for contributing (48%). This underscores the need for Wikimedia to actively address these perceptions and demonstrate the scholarly value of contributing. Potential tools like project-based reporting with the possibility of a DOI could help bridge this gap by linking Wikimedia contributions more directly to academic outputs. The study also reveals a disconnect between the current methods of recognizing academic work and contributions to Wikimedia.
While a substantial portion of researchers engage in public knowledge sharing, a significant percentage report receiving no formal rewards or recognition for these efforts. This lack of institutional support suggests that academic institutions need to develop clearer pathways for valuing and rewarding Wikimedia contributions in promotion and tenure processes.

Furthermore, the study points to the importance of user-friendly tools for academics to track and communicate their impact. The suggestion of an Individual Researcher Profile in the focus groups addresses the need for data to be presented in a more digestible and engaging way for audiences beyond the Wikimedia community, such as in academia. This is crucial, as the survey shows that academics currently rely on informal conversations and professional documents to communicate their Wikimedia engagement. A dedicated tool could streamline this process and provide more compelling evidence of their contributions' impact. As a follow-up to our research, our team is currently investigating a modification to the Program & Events Dashboard to include an "Individual Profile" feature, which has the potential to accomplish some of these goals.

Finally, our study also cautions against potential pitfalls. The focus groups raised concerns about the potential of academics "gaming" (falsely inflating) metrics, which needs to be addressed in the design of any new tools. Additionally, the survey indicates that academics already use multiple systems to track their impact, so any new tool should ideally be integrated with existing workflows or offer a significantly improved value proposition to avoid creating additional burden.
In summary, the implications of this study highlight a clear demand for better tools and recognition systems to incentivize academic and other professional engagement with Wikimedia. Incentivization cannot be accomplished through symbolic validation alone, because academics work in markets that are too competitive (Gallus, 2017). Additionally, the metrics that are commonly used carry little weight in professional circles. Better metrics might be possible, but these should focus on showcasing academically valued contributions, addressing concerns about reliability, integrating with existing academic practices, and providing user-friendly ways to track and communicate impact. Tools like the proposed Wikimedia Impact Visualizer and Wikimedia Impact Tracker have the potential to play crucial roles by presenting data in a way that resonates with academic values and demonstrates the broader impact of Wikimedia work.
Resources
- Broad survey announcement document: https://docs.google.com/document/d/1fPMVDGjM8pBCEOU2sEwENtoCuZVzX7Gx/edit?usp=sharing&ouid=113182016009423657566&rtpof=true&sd=true
- Survey export: https://docs.google.com/document/d/1aK80FyDbPiyGB8rWZ2s1-MxZlWp-HpfC/edit?usp=sharing&ouid=113182016009423657566&rtpof=true&sd=true
References
Arroyo-Machado, W., Díaz-Faes, A. A., Herrera-Viedma, E., & Costas, R. (2023). From academic to media capital: To what extent does the scientific reputation of universities translate into Wikipedia attention? Journal of the Association for Information Science and Technology.
Besançon, L., Peiffer-Smadja, N., Segalas, C., Jiang, H., Masuzzo, P., Smout, C., Billy, E., Deforet, M., & Leyrat, C. (2021). Open science saves lives: lessons from the COVID-19 pandemic. BMC Medical Research Methodology, 21(1), 117. https://doi.org/10.1186/s12874-021-01304-y
Buttliere, B., & Wicherts, J. (2013). What next for scientific communication? A large scale survey of psychologists on problems and potential solutions. Masterʼs Thesis.
Buttliere, B. T. (2014). Using science and psychology to improve the dissemination and evaluation of scientific work. Frontiers in Computational Neuroscience, 8, 82.
Buttliere, B., & Wicherts, J. (2014). Opinions on the value of direct replication: A survey of 2,000 psychologists. PsyArXiv.
Buttliere, B., & Buder, J. (2017). Personalizing papers using Altmetrics: Comparing paper ʻQuality ʼ or ʻImpactʼ to person ʻIntelligenceʼ or ʻPersonalityʼ. Scientometrics, 111(1), 219-239.
Buder, J., Zimmermann, A., Buttliere, B., Rabl, L., Vogel, M., & Huff, M. (2023). Online interaction turns the congeniality bias into an uncongeniality bias. Psychological Science, 34(10), 1055-1068.
Chen, Y., Farzan, R., Kraut, R., YeckehZaare, I., & Zhang, A. F. (2023). Motivating experts to contribute to digital public goods: A personalized field experiment on Wikipedia. Management Science, mnsc.2023.4852. https://doi.org/10.1287/mnsc.2023.4852
Impact. (2022, August 12). Wiki Education. https://wikiedu.org/impact/
Jemielniak, D., & Aibar, E. (2016). Bridging the gap between Wikipedia and academia. Journal of the Association for Information Science and Technology, 67(7), 1773–1776. https://doi.org/10.1002/asi.23691
Kincaid, D. W., Beck, W. S., Brandt, J. E., Mars Brisbin, M., Farrell, K. J., Hondula, K. L., Larson, E. I., & Shogren, A. J. (2021). Wikipedia can help resolve information inequality in the aquatic sciences. Limnology and Oceanography Letters, 6(1), 18–23. https://doi.org/10.1002/lol2.10168
Konieczny, P. (2016). Teaching with Wikipedia in a 21st-century classroom: Perceptions of Wikipedia and its educational benefits. Journal of the Association for Information Science and Technology, 67(7), 1523–1534. https://doi.org/10.1002/asi.23616
Kousha, K., & Thelwall, M. (2017). Are Wikipedia citations important evidence of the impact of scholarly articles and books? Journal of the Association for Information Science and Technology, 68(3), 762–779. https://doi.org/10.1002/asi.23694
Oravec, J. A. (2019). The "Dark Side" of Academics? Emerging Issues in the Gaming and Manipulation of Metrics in Higher Education. The Review of Higher Education, 42(3), 859–877. https://muse.jhu.edu/pub/1/article/720763
Poulter, M., & Sheppard, N. (2020). Wikimedia and universities: contributing to the global commons in the Age of Disinformation. Insights: the UKSG Journal, 33(1). https://eprints.whiterose.ac.uk/160126/
Taraborelli, D., Mietchen, D., Alevizou, P., & Gill, A. J. (2011). Expert participation on Wikipedia: Barriers and opportunities [Conference presentation]. Wikimania 2011, Haifa, Israel. https://upload.wikimedia.org/wikipedia/commons/4/4f/Expert_Participation_Survey__Wikimania_2011.pdf
Vetter, M. A., McDowell, Z. J., & Stewart, M. (2019). From opportunities to outcomes: The Wikipedia-based writing assignment. Computers and Composition, 52, 53-64.
Zagorova, O., Ulloa, R., Weller, K., & Flöck, F. (2022). "I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics. Quantitative Science Studies, 3(1), 147–173. https://doi.org/10.1162/qss_a_00171