Community Insights/Community Insights 2020 Report/Survey Methodology
How was this data collected?
The Community Insights survey was conducted in September and October of 2019 through Qualtrics. Most community members were selected to take the survey using a stratified random sampling strategy, with the goal of collecting a sufficient number of responses across different Wikimedia projects and levels of editing activity to identify experiential differences between groups in a statistically valid way. Contributors sampled by activity level and project received a link on their talk page to take the survey. Movement organizers were sampled randomly from Foundation grantee contacts, affiliate contacts, and organizer dashboard users, and were contacted to take the survey via email. A convenience sample was used to target volunteer developers, who followed a publicly posted link in spaces where they frequently collaborate, such as MediaWiki and Discourse.
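For illustration, here is a minimal sketch of what stratified sampling of this kind might look like. The project names, activity strata, and per-stratum sample size are hypothetical, not the values used for the actual survey.

```python
import random

# Hypothetical editor pool: each record carries a project and an activity level.
# The projects, strata, and per-stratum sample size are illustrative only, not
# the values used for the actual Community Insights sampling.
editors = [
    {
        "user": f"Editor{i}",
        "project": random.choice(["wikipedia", "commons", "wikidata"]),
        "activity": random.choice(["1-4 edits", "5-99 edits", "100+ edits"]),
    }
    for i in range(10_000)
]

def stratified_sample(pool, per_stratum=50, seed=42):
    """Draw up to `per_stratum` editors from every (project, activity) stratum."""
    rng = random.Random(seed)
    strata = {}
    for editor in pool:
        strata.setdefault((editor["project"], editor["activity"]), []).append(editor)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

invitees = stratified_sample(editors)
print(f"Invited {len(invitees)} editors")
```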
How many people responded, and how many completed the survey?
14,965 contributors were randomly selected from across Wikimedia projects. 5,739 of them (38%) began the survey, and 2,207 (15%) completed at least half of it. 896 movement organizers were sampled from leadership dashboard users, grantees, and affiliate representatives; 287 of them (32%) began the survey, and 215 (24%) completed at least half of it. Volunteer developers were recruited through a convenience sample, so there was no fixed sample size; 984 developers began the survey, and 91 completed at least half of it.
In all, 7,010 people opened the survey and responded to at least one question, 2,589 completed at least half the questions, and 2,483 completed the entire survey.
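As a quick check, the response and completion percentages above can be reproduced directly from the reported counts:

```python
# Reproduce the response and completion rates reported above from the raw counts.
groups = {
    "contributors":        {"sampled": 14_965, "began": 5_739, "half_completed": 2_207},
    "movement organizers": {"sampled": 896, "began": 287, "half_completed": 215},
}

for name, g in groups.items():
    print(
        f"{name}: {g['began'] / g['sampled']:.0%} began, "
        f"{g['half_completed'] / g['sampled']:.0%} completed at least half"
    )
# contributors: 38% began, 15% completed at least half
# movement organizers: 32% began, 24% completed at least half
```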
How was this data analyzed?
Unless otherwise specified, all frequencies and statistical tests are weighted by editing activity level using an inverse propensity score method. The propensity score used was the likelihood that editors in each activity stratum would complete at least 50 percent of the Community Insights survey. Effectively, weighting in this way means that those who were less likely to take the survey (typically less active editors) carry slightly more weight in our analyses, so that our results do not overrepresent the experiences of highly active editors, who are more likely to take the survey. In the interest of equitably representing Wikimedians, especially less-active editors who are less likely to frequent the spaces where we tend to collect data, we have not excluded respondents who completed less than 50 percent of the survey from the analyses in this report, and we have applied the same weights to their responses as those established for users who completed at least half the survey. Where including this group may have influenced analytic results, details are noted in the relevant endnotes in Methodological Appendix B.

Analyses are conducted without weights for (1) movement organizers and volunteer developers, because editing activity is unknown for these contributors and we cannot calculate activity-level strata, and (2) comparisons of editors at different activity levels, because this variable was used to assign the propensity score, so weighting has no analytic effect. Comparisons of movement organizers with other contributors are weighted by activity level unless otherwise specified.
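As a rough sketch of the weighting approach described above (the activity strata, completion rates, and survey item below are hypothetical, not the actual survey values):

```python
import pandas as pd

# Hypothetical completion propensities per activity stratum: the share of invited
# editors in each stratum who completed at least 50% of the survey. These strata
# and rates are illustrative, not the actual Community Insights values.
completion_propensity = {
    "1-4 edits/month": 0.08,
    "5-99 edits/month": 0.15,
    "100+ edits/month": 0.30,
}

# Each respondent's weight is the inverse of their stratum's completion propensity,
# so respondents from under-represented (typically less active) strata count for more.
weights = {stratum: 1.0 / p for stratum, p in completion_propensity.items()}

# Toy responses to a yes/no survey item.
responses = pd.DataFrame({
    "stratum": ["1-4 edits/month", "5-99 edits/month", "100+ edits/month", "100+ edits/month"],
    "item": [1, 0, 1, 1],
})
responses["weight"] = responses["stratum"].map(weights)

unweighted = responses["item"].mean()
weighted = (responses["item"] * responses["weight"]).sum() / responses["weight"].sum()
print(f"unweighted: {unweighted:.2f}, weighted: {weighted:.2f}")
```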
Any differences between groups described in this report are only those that reached statistical significance—meaning that the difference is large enough, and/or seen among enough respondents, to lead us to conclude that a real population difference has likely been detected, rather than statistical “noise.” For some groups we detected no differences, but this does not always mean that differences do not exist; at times, we simply did not have a large enough subsample to detect them. This is why geographic differences are often reported at the regional/continental level: we acknowledge that continents are diverse and complicated spaces, but we often lack the statistical power, usually due to sample size, to detect country- or even sub-continent-level differences.
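To illustrate how sample size drives whether a difference reaches significance, here is a small sketch with made-up numbers (not survey data), using a standard two-proportion z-test:

```python
from math import erf, sqrt

def two_proportion_pvalue(hits_a, n_a, hits_b, n_b):
    """Two-sided p-value for a two-proportion z-test with a pooled standard error."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

# The same 10-point gap (60% vs. 50%) reads as noise with 50 respondents per group
# but reaches significance with 500 per group.
print(two_proportion_pvalue(30, 50, 25, 50))      # ~0.31 -> not significant
print(two_proportion_pvalue(300, 500, 250, 500))  # ~0.0015 -> significant
```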
Countries and territories are categorized into continents according to the United Nations Statistics Division, with the exception of the Americas, which are separated into the UN’s two sub-regional classifications, Northern America and Latin America/Caribbean. For more about which countries make up the regional and subregional classifications used in this report, see the UN’s Geographic Regions.
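As an illustration of this classification, here is a partial mapping of example countries to the regions used in this report; only a handful of countries are shown, and the full assignments follow the UN Statistics Division scheme.

```python
# Partial, illustrative mapping of countries to the regions used in this report:
# UN Statistics Division continents, with the Americas split into the two
# sub-regions named above. Only a few example countries are shown.
REGION = {
    "Canada": "Northern America",
    "United States of America": "Northern America",
    "Mexico": "Latin America and the Caribbean",
    "Brazil": "Latin America and the Caribbean",
    "Germany": "Europe",
    "Nigeria": "Africa",
    "India": "Asia",
    "Australia": "Oceania",
}

print(REGION["Mexico"])  # Latin America and the Caribbean
```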
Newcomers are defined as people who began contributing to Wikimedia projects in 2018 or 2019.
Youth are defined as people 18-25 years old. Due to the various legal protections of minors in research participation, we did not attempt to survey people under the age of 18.
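Expressed as simple filters (purely illustrative; the field names are hypothetical), these two cohort definitions amount to:

```python
def is_newcomer(first_contribution_year: int) -> bool:
    """Newcomers began contributing to Wikimedia projects in 2018 or 2019."""
    return first_contribution_year in (2018, 2019)

def is_youth(age: int) -> bool:
    """Youth respondents are 18-25 years old; people under 18 were not surveyed."""
    return 18 <= age <= 25
```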
What are the limitations of the Community Insights survey?
Each year, we work towards making the data from Community Insights more representative of Wikimedia communities worldwide. This year, the survey was written in English and translated into 12 other languages: Standard Chinese, Latin American Spanish, Portuguese, Russian, Japanese, French, German, Italian, Arabic, Polish, Ukrainian, and Dutch. While these languages reach contributors to some of the largest Wikipedias, they fail to reach many contributors (especially in Southern and South-eastern Asia) who are not highly proficient in the 13 available languages. In coming years, we intend to expand the available translations to better reach our diverse community of contributors.
Additionally, our measures of movement diversity have historically focused on genders and nationalities, not on intra-national diversity of race and ethnicity. Including survey measures of race and ethnicity that are locally relevant takes research, time, and care, and we are working to begin including such measures in the 2020 version of the survey.