Research:Developing Wikimedia Impact Metrics as a Sociotechnical Solution for Encouraging Funder and Academic Engagement
This page documents a research project in progress.
Information may be incomplete and change as the project progresses.
Please contact the project lead before formally citing or reusing results from this page.
Introduction
Access the full Stage II proposal on Open Review: https://openreview.net/attachment?id=9AhIrXEmDB&name=stage_two_submission
The main problems we seek to solve are the low levels of academic engagement in Wikimedia, and the (lack of) professional recognition that academics and volunteers receive for engaging (Jemielniak & Aibar, 2016; Konieczny, 2016). Our project addresses these by researching, developing, and making available so-called Wikimedia Impact Metrics, which demonstrate the impact of contributions. In conjunction with this, we aim to build a consortium of researchers in the academic community to push funders and universities to value these contributions and apply for larger grants.
The basic argument is that scientists want to make their knowledge open, and Wikimedia is where the public looks for information, yet few scientists engage with Wikimedia. Research suggests this is in part due to a lack of incentives to engage (Chen et al., 2023; Kincaid et al., 2021; Taraborelli et al., 2011), and thus we develop Wikimedia Impact Metrics and encourage funders to value them as grant outcomes. By the end of the grant, we expect to have built a community with the survey, developed some basic metrics, and produced a demonstrator that the community can provide feedback on.
About the research team
Brett Buttliere
is an Assistant Professor at the Centre for European, Regional, and Local Studies, University of Warsaw. He has done research on the history of science using Wikimedia, and on how Wikimedia can help make science open. He works on digital infrastructure for science in general, especially by keeping in mind the psychology of the scientist, since scientists are the ones actually doing the science and making the decisions.
Matthew Vetter
is a Professor of English at Indiana University of Pennsylvania. He co-chairs the CCCC Wikipedia Initiative, a disciplinary project to involve more English Studies academics in contributing to Wikimedia via WikiProject Writing. He is the author, with Zach McDowell, of Wikipedia and the Representation of Reality (Routledge, 2021). He is a veteran instructor with Wiki Education, as well as a researcher on Wikipedia-based education. He has also served on the Wikimedia Foundation's North America regional grants committee for the past three years.
Sage Ross
is Chief Technology Officer of Wiki Education and the main developer behind the P&E Dashboard. He has developed many metrics related to Wikimedia and its utilization, and the P&E Dashboard has been used across hundreds of initiatives, programs, and systems Wikimedia-wide. He has an extensive background in developing websites and successful Wikimedia tools.
Methods
There are four specific goals of the project:
- Survey and interview Wikimedians to understand what metrics are good for indicating impact.
- Develop some basic metrics of impact including contributions, pageviews, clickthroughs, and others.
- Develop a minimally viable product (MVP) that presents some basic statistics such as edits and views.
- Develop and apply for larger grants including conference and EU grants for e.g., science communication.
Our goal is to understand how Wikimedia can strengthen its position in the knowledge ecosystem through the engagement of academics and other experts, especially through the development of metrics of impact. This goal is particularly relevant because it provides professional credit to those already in the movement as well as encouraging others to join (Strategy 2030 1.1: Support Volunteers).
We approach this topic from three directions, each corresponding to one co-PI, with each estimated at approximately 3-4 months work.
Methods for the survey
The idea is to survey Wikimedians, and organizers of events at academic associations (as represented by the organizer), on which metrics they think indicate impact. The sample for the survey will come from two sources. First, our project team is currently engaged in a Rapid Grant in which we examine and collect data on how various professional organizations are engaging with Wikimedia. This project involves surveying 200 academic associations' websites for Wikimedia-related activities and identifying the people who have engaged with or helped develop Wikimedia-related activities within those organizations (so-called organizational champions). Aside from these organizational champions, an identical but separate survey will be spread throughout our networks, especially the Wikimedia Research Group, Wikimedia Education, the Edu Wiki User Group, the CCCC Wikipedia Initiative / WikiProject Writing, OpenCon, and the SPARC Open Forum. Additionally, Chris Shilling has offered to let us email those they have funded in the past, to ask how they would measure their impact. Finally, if needed, we intend to leverage the network of Jamie Mathewson at Wiki Education to identify champions who might have interest in the topic to take the survey. The current draft of the survey is available here, and we welcome feedback on its further development, including any particular questions to be asked or themes to be investigated.
Survey and interview inclusion criteria
- Academic scientists and/or researchers with some previous and/or future interest in engaging with Wikimedia projects (Wikipedia, Wikimedia Commons, Wikisource, Wikidata) as part of their research or professional activity
- English-speaking participants
Methods of subject selection and recruitment
We intend to use the following procedures for subject selection. First, we will share an advertisement about the study (attached to this protocol) across multiple digital channels, including listservs related to Wikimedia projects and social media sites (LinkedIn, the Facebook Wikipedia Weekly group), and by soliciting specific targeted individuals.
Methods and procedures applied to human subjects
Prospective participants will first read the study advertisement via various online channels (social media, specialized listservs). Those who identify that they meet the inclusion criteria will respond to the survey. They will first read the consent form. Those who agree to participate will be asked to complete a survey consisting of a series of qualitative and quantitative questions. The survey will take approximately 10-15 minutes. At the end of the survey, participants will be invited to take part in a subsequent interview; if willing, they will provide their contact email address. Thirty-minute interviews will then be conducted via Zoom with roughly 5-10 participants.
Risks and benefits
There will be minimal risk to all participants when completing the survey and interview. Human subjects will contribute to the development of an impact metric tool for Wikimedia engagement, which may positively benefit their professional activities in the future.
Privacy and confidentiality
The researchers will keep and store the data on a password-protected Qualtrics site, meaning that only the researchers will be able to log on to see the results from the survey. In addition, the data will be saved on a password-locked laptop. Participants who complete the survey but do not choose to share their contact information (for interview or other follow-up) will remain completely anonymous; those who do share their contact information will not be anonymous, but their identities will be kept confidential to the researchers. Interview data will be recorded and transcribed, and both recordings and transcripts will be kept on password-protected websites and computers and shared only with PIs Buttliere and Vetter.
Consent
Consent process for survey: The consent form will be placed at the beginning of the Qualtrics survey. Participants will read through the consent form and then indicate whether they want to take the survey. If they say "Yes, I give my consent", they will be taken to the beginning of the survey; if they say "No, I don't give my consent", they will not take the survey.
Consent process for interview: Survey participants will be given the opportunity to express interest in an interview as a question in the survey. PIs will email those survey participants who expressed interest with a separate informed consent document for the interview portion of the study. Once their completed informed consent document is returned, we will schedule the interview with them.
Methods for metric development
The goal is to develop bibliometric indicators for researchers. Establishing effective bibliometrics involves a series of standard steps and set points, which are used to understand the new tools in comparison to other metrics. Beyond these individual metrics, there are necessary considerations at the aggregate level. The current P&E Dashboard is for looking across users, and it will likely be of use to funders and decision makers to be able to see, e.g., across several years of their funding efforts, who is having impact, and in what areas. Currently, our team is considering metrics of contributions and metrics of impact, as laid out in Table 1. Many of the metrics are 'standard' in the sense that they are already made available through the P&E Dashboard.
Table 1. Metrics of contribution and their corresponding metrics of impact.

| Contribution | Impact |
| --- | --- |
| Words added | Views |
| Articles added | Views |
| References | Click-throughs |
| Wikilinks | Click-throughs |
| External links | Click-throughs |
| DOIs added | Click-throughs |
| Wikimedia uploads | Downloads |
These two types of metrics map fairly well onto existing metrics both in science and in, e.g., online business (views and click-throughs). In science, the analogues are the creation of data or papers, and then citations.
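Pageview counts in particular are already publicly available per article through the Wikimedia Pageviews REST API. As an illustrative sketch (the function name and defaults below are ours, not part of the P&E Dashboard), a per-article query URL can be built like this:

```python
from urllib.parse import quote

# Public Wikimedia Pageviews REST API (wikimedia.org/api/rest_v1)
PAGEVIEWS_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, start, end, project="en.wikipedia",
                  access="all-access", agent="user", granularity="daily"):
    """Build a per-article pageviews query URL.

    `start` and `end` are YYYYMMDD strings. Per API convention, spaces
    in the article title become underscores and the title is
    percent-encoded.
    """
    title = quote(article.replace(" ", "_"), safe="")
    return (f"{PAGEVIEWS_BASE}/{project}/{access}/{agent}/"
            f"{title}/{granularity}/{start}/{end}")
```

Fetching such a URL (with a descriptive User-Agent header, as Wikimedia API etiquette requests) returns a JSON list of daily view counts that can be summed into the "Views" column of Table 1.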
A more important aspect of this work will be determining which edits are relevant to a given grant and which are not, especially when users have multiple ongoing grants for which not all edits are relevant. Currently, the P&E Dashboard allows users to choose target articles and categories to include in metrics, but we suspect that we will also need to be able to select edits that fall, e.g., within the grant dates, and some mechanisms will need to be considered to prevent double dipping. Finally, it will be important to think about how the metrics can and will be "gamed" (Oravec, 2019), and to work toward preventing such misuse.
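To make the selection problem concrete, here is a minimal sketch of grant-window filtering, assuming a simplified edit record (the field names and structure are illustrative, not the P&E Dashboard's actual schema):

```python
from datetime import date

def grant_relevant_edits(edits, grant_start, grant_end, target_articles=None):
    """Select edits attributable to a single grant.

    `edits` is a list of dicts with 'article' and 'date' (datetime.date)
    keys -- a stand-in for data from the Dashboard or the MediaWiki API.
    An edit qualifies if it falls within the grant window and, when a
    target set is given, touches one of the target articles.
    """
    selected = []
    for edit in edits:
        if not (grant_start <= edit["date"] <= grant_end):
            continue  # outside the funded period
        if target_articles is not None and edit["article"] not in target_articles:
            continue  # not a target article for this grant
        selected.append(edit)
    return selected
```

To prevent double dipping across overlapping grants, each edit could additionally carry an explicit grant tag, so that an edit claimed under one grant is excluded from the others.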
Methods for tool development
The main goal of WP3 is the development of a minimally viable product (MVP) that can be presented to the community for feedback. This product will be based on the P&E Dashboard. The P&E Dashboard is mostly used 'internally', within Wikimedia or within classrooms or particular programs or events, to track contributions and impact. The goal here will be to rework it in small ways such that it is more accessible and useful for individual researchers, universities, and funders, as compared to Wikimedia program directors looking to track developments.
Each campaign can be a grant class. This reworking will involve renaming various portions and, most importantly, enabling authors to easily package particular edits as 'project relevant' in a way that funders or universities can then examine and isolate as parts of, e.g., tenure packets or grant outcomes. This is the major research and development part of the project. The idea is to stick as closely as possible to the P&E Dashboard, mostly focusing on making it more administration- and evaluation-friendly. Most of the metrics used on the P&E Dashboard will remain, though we look to develop them further based on the results of the survey. We do expect to develop a new website to accommodate this tool; Wikimediaimpact.com and Wikimediaimpact.org are available for approximately $50 per year. Developing this functionality will serve two purposes, since the technology will also be implemented in the P&E Dashboard itself, such that editors will have better control over which of their edits are program- and event-relevant.
Presenting, team building, and grant writing
A significant portion of the work is presenting the project, team building, networking, and identifying people with whom to further develop the tool. It does not really make sense to build this tool unless we market and advertise it so that it can be effectively used. We expect by the end of the year to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland. These represent excellent venues to gather opinions both within the Wikimedia community and within the research evaluation community. Building this team is more than side or part-time work, and we changed the project from Stage 1 to make this work more explicit, since building a team and applying for large-scale grants is not an insignificant amount of work. These work packages combine quantitative, investigative, and qualitative research to solidify Wikimedia's position as an Open Knowledge platform, positioning Wikipedia as an interface between science and the public.
Expected output
This grant will support the development of several expected outputs, including at least one scientific paper outlining the argumentation and survey results, the development of the WikimediaImpact.org website, several conference papers, and the development and submission of at least one grant to support continued development.
Publish an academic paper
Our goal is to publish a main paper and several conference papers outlining the work and the idea. The goal of the main paper will be to have widespread impact, introducing the tool with examples and making the argument in a major journal. We will target high-impact journals such as Science or Nature, but concede that the paper may be published in Scientometrics, Quantitative Science Studies, or other venues where bibliometrics are valued.
Minimally viable website and tool
Creating the metrics and functionality is not useful if they are not made available to the public. Thus, we expect to develop a minimally viable product (MVP), which is not intended to be a final product but rather the most basic functions possible, on which we can receive feedback.

Conference presentations and panel

As outlined in the Methods section, we expect by the end of the year to have presented the project at least three times: at the Wiki Workshop, at Wikimania 2024 (Katowice), and at the Research Evaluation in the Social Sciences and Humanities Conference in Galway, Ireland, or another relevant conference.
Press/Media
- "Vetter Awarded Research Grant from Wikimedia Foundation." English Department News. Indiana University of Pennsylvania.
Timeline
TBD
Policy, Ethics and Human Subjects Research
THIS PROJECT HAS BEEN APPROVED BY THE INDIANA UNIVERSITY OF PENNSYLVANIA INSTITUTIONAL REVIEW BOARD FOR THE PROTECTION OF HUMAN SUBJECTS (PHONE 724.357.7730).
Results
TBD
Resources
- Broad survey announcement document: https://docs.google.com/document/d/1fPMVDGjM8pBCEOU2sEwENtoCuZVzX7Gx/edit?usp=sharing&ouid=113182016009423657566&rtpof=true&sd=true
- Survey export: https://docs.google.com/document/d/1aK80FyDbPiyGB8rWZ2s1-MxZlWp-HpfC/edit?usp=sharing&ouid=113182016009423657566&rtpof=true&sd=true
References
Arroyo-Machado, W., Díaz-Faes, A. A., Herrera-Viedma, E., & Costas, R. (2023). From academic to media capital: To what extent does the scientific reputation of universities translate into Wikipedia attention? Journal of the Association for Information Science and Technology.
Besançon, L., Peiffer-Smadja, N., Segalas, C., Jiang, H., Masuzzo, P., Smout, C., Billy, E., Deforet, M., & Leyrat, C. (2021). Open science saves lives: Lessons from the COVID-19 pandemic. BMC Medical Research Methodology, 21(1), 117. https://doi.org/10.1186/s12874-021-01304-y
Buder, J., Zimmermann, A., Buttliere, B., Rabl, L., Vogel, M., & Huff, M. (2023). Online interaction turns the congeniality bias into an uncongeniality bias. Psychological Science, 34(10), 1055-1068.
Buttliere, B., & Buder, J. (2017). Personalizing papers using Altmetrics: Comparing paper 'Quality' or 'Impact' to person 'Intelligence' or 'Personality'. Scientometrics, 111(1), 219-239.
Buttliere, B., & Wicherts, J. (2013). What next for scientific communication? A large scale survey of psychologists on problems and potential solutions. Master's thesis.
Buttliere, B., & Wicherts, J. (2014). Opinions on the value of direct replication: A survey of 2,000 psychologists. PsyArXiv.
Buttliere, B. T. (2014). Using science and psychology to improve the dissemination and evaluation of scientific work. Frontiers in Computational Neuroscience, 8, 82.
Chen, Y., Farzan, R., Kraut, R., YeckehZaare, I., & Zhang, A. F. (2023). Motivating experts to contribute to digital public goods: A personalized field experiment on Wikipedia. Management Science. https://doi.org/10.1287/mnsc.2023.4852
Impact. (2022, August 12). Wiki Education. https://wikiedu.org/impact/
Jemielniak, D., & Aibar, E. (2016). Bridging the gap between Wikipedia and academia. Journal of the Association for Information Science and Technology, 67(7), 1773–1776. https://doi.org/10.1002/asi.23691
Kincaid, D. W., Beck, W. S., Brandt, J. E., Mars Brisbin, M., Farrell, K. J., Hondula, K. L., Larson, E. I., & Shogren, A. J. (2021). Wikipedia can help resolve information inequality in the aquatic sciences. Limnology and Oceanography Letters, 6(1), 18–23. https://doi.org/10.1002/lol2.10168
Konieczny, P. (2016). Teaching with Wikipedia in a 21st-century classroom: Perceptions of Wikipedia and its educational benefits. Journal of the Association for Information Science and Technology, 67(7), 1523–1534. https://doi.org/10.1002/asi.23616
Kousha, K., & Thelwall, M. (2017). Are Wikipedia citations important evidence of the impact of scholarly articles and books? Journal of the Association for Information Science and Technology, 68(3), 762–779. https://doi.org/10.1002/asi.23694
Oravec, J. A. (2019). The "dark side" of academics? Emerging issues in the gaming and manipulation of metrics in higher education. The Review of Higher Education, 42(3), 859–877. https://muse.jhu.edu/pub/1/article/720763
Poulter, M., & Sheppard, N. (2020). Wikimedia and universities: Contributing to the global commons in the Age of Disinformation. Insights: The UKSG Journal, 33(1). https://eprints.whiterose.ac.uk/160126/
Taraborelli, D., Mietchen, D., Alevizou, P., & Gill, A. J. (2011). Expert participation on Wikipedia: Barriers and opportunities [Conference presentation]. Wikimania 2011, Haifa, Israel. https://upload.wikimedia.org/wikipedia/commons/4/4f/Expert_Participation_Survey__Wikimania_2011.pdf
Vetter, M. A., McDowell, Z. J., & Stewart, M. (2019). From opportunities to outcomes: The Wikipedia-based writing assignment. Computers and Composition, 52, 53-64.
Zagorova, O., Ulloa, R., Weller, K., & Flöck, F. (2022). "I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics. Quantitative Science Studies, 3(1), 147–173. https://doi.org/10.1162/qss_a_00171