Wikimedia Blog/Drafts/New survey creates path for hearing from diverse Wikimedia communities

Title ideas

Ideally three to ten words, the headline to your piece will show up in social media shares and on the blog's homepage. Try to capture the most interesting part of your piece.
  • New survey creates path for hearing from diverse Wikimedia communities
  • Outcomes from Wikimedia Contributors and Communities Survey
  • Listening to diverse community voices
  • Engaging communities in new ways
  • Community Engagement Insights: A pathway to hearing diverse Wikimedia voices


Summary

A brief summary of the post's content, about 20-50 words. On the blog, the summary will appear in italicized text underneath the headline. You can use this space as a teaser, expansion of your headline, or as a summary of the post.
  • Last year, we experimented with Community Engagement Insights, a comprehensive project to hear from Wikimedia contributors and gather input to design Wikimedia Foundation programs and support. We are presenting the findings next Tuesday, October 10, at 10 AM PST. Join us to learn more about the report and how teams will be using it in their annual planning!

Body

Imagine a collection of surveys designed to hear from Wikimedia contributors across every project and affiliate, with the results used to shape the Wikimedia Foundation's program strategies. That is what Community Engagement Insights is about.

In 2016, the Wikimedia Foundation launched a new project called Community Engagement Insights, under which we designed the Wikimedia Contributors and Communities Survey. The project aims to improve alignment between the Wikimedia Foundation and the communities it serves. Foundation staff designed hundreds of questions that were organized into a single, comprehensive, online survey, which the Foundation sent to many different types of Wikimedians, including editors, affiliates, program leaders, and technical contributors. Now that the basic analysis is complete, we want to share what we learned and how we are going to use it. On Tuesday, October 10, at 10 am PST, we will hold a public meeting where we will present some of the data we found, explain how different teams will use it, offer guidance on how to navigate the report, and open the space for questions. Join in the conversation! You can watch the livestream here, and ask questions via IRC on #wikimedia-office.

How is this survey different from others?

Looking at the history of surveys at the Foundation, we did not have a systematic approach to gathering input and feedback from communities about the Foundation's programs and support. For example, between 2011 and 2015, there were only four comprehensive contributor surveys: three Wikipedia Editor surveys and one Global South User Survey. Between 2015 and 2016, however, we saw a growing demand for community survey data: in that year alone, 10 different teams requested 10 different surveys. While the first four surveys were exploratory, designed to learn from users about broad themes, the newer surveys sought more specific feedback on projects and initiatives. Yet individual teams did not have a structure for this type of inquiry with our international audiences, nor was there a system in place for the Foundation to hear from the communities it serves on a regular basis.

This is when we started to think about a collaborative approach to surveying communities. At the beginning of the Community Engagement Insights project, we interviewed teams to learn about the specific audience groups they worked with and what information they wanted to learn from them. Understanding that the need for community survey data would continue to grow, we started thinking of a systematic, shared solution that could support this emerging demand. The Wikimedia Contributors and Communities Survey was the answer. It has three key characteristics:

  • It is annual: the Wikimedia Contributors and Communities Survey is repeated year over year to observe change over time.
  • There is a submission process: Foundation teams can opt in and submit the survey questions they would like answered.
  • It is a collaborative effort: survey expertise is spread across the organization, so people from different teams, especially those who have submitted questions, collaborate not only on survey design, sampling, and analysis, but also on outreach, messaging, translation, and communication.

Towards the end of the survey design process, 13 teams had submitted 260 questions, aimed at 4 audience groups: editors or contributors, affiliates, program leaders, and developers (also known as technical contributors).

What did we learn about Wikimedia communities and how we serve them?

 
Male to female ratios. Graphic from CE Insights 2016-17 report.
Regions editor respondents reported they come from. Graphic from CE Insights 2016-17 report.

In terms of response rates, we had a 26% response rate from editors (4,100 responses), 53% from affiliates (127), and 46% from program leaders (241). Volunteer developers were not sampled; we received 129 responses from that audience group.
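
As a rough aside (our own back-of-the-envelope arithmetic, not a figure from the report), a response rate is simply responses received divided by invitations sent, so the approximate size of each invited sample can be recovered from the numbers above:

\[
\text{response rate} = \frac{\text{responses received}}{\text{invitations sent}}
\quad\Longrightarrow\quad
\text{invitations sent} \approx \frac{\text{responses received}}{\text{response rate}},
\qquad \text{e.g. editors: } \frac{4{,}100}{0.26} \approx 15{,}800.
\]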

For Wikimedia Foundation programs that are community-facing, we collected data in three areas. The first, personal information, helped us understand characteristics of the communities we serve, such as gender and other demographics. Here, we noticed that while the percentage of women contributors is still below 15% across regions, the figure is higher for leadership roles: 25% of program leaders are women, as are 28% of affiliate representatives. The largest share of editors across all projects comes from Western Europe (44%), followed by Eastern Europe (15%), South America (11%), and Western Asia (9%). [1]

The Wikimedia Contributors and Communities Survey also had questions about Wikimedia environments. Environments are the spaces where we are trying to make an impact as a movement, such as the Wikimedia projects, Wikimedia software, or Wikimedia affiliates. Looking at the Wikimedia projects environment, we learned that 31% of all survey participants (editors, volunteer developers, program leaders, and affiliates) have at some point felt uncomfortable or unsafe in Wikimedia spaces, online or offline. Also, 39% agree or strongly agree that people have a difficult time understanding and empathizing with others. When participants rated issues on Wikipedia, the top three were vandalism, the difficulty of gaining consensus on changes, and too much work going undone. While 72% of editors reported being satisfied or very satisfied with the software they use to contribute, 20% reported being neither satisfied nor dissatisfied.

 
Activities in the last 12 months. Graphic from CE Insights 2016-17 report.
Percent of participants who began contributing from 2001-2016. Graphic from CE Insights 2016-17 report.

Wikimedia Foundation programs are the third area we explored. Programs include the Annual Plan programs, which aim to achieve specific goals, as well as regular workflows such as improving collaboration and communication. As one example, the Support and Safety team at the Foundation offers services to support Wikimedians. The team learned that, on average, only 22% of editors across regions had engaged with Foundation staff or board members. Regionally, the Middle East and Africa had the highest rate of engagement, at 37%.

These are only a few highlights. The full report has hundreds more data points across all three areas, covering community health, software development, fundraising, capacity development, brand awareness, and collaboration, among other topics and programs.

Get involved!

Since each team had specific questions tied to particular initiatives and projects, each group is now analyzing the data it received. The end goal is to use this information for data-driven direction and annual planning. Some teams will share their takeaways in a meeting next Tuesday, October 10, at 10 am PST. During this meeting, we will also present some of the data we found, offer guidance on how to navigate the report, and open the space for questions. Join in the conversation! You can watch the livestream here, and ask questions via IRC on #wikimedia-office.

We will continue to work with community organizers who can help spread the word about the survey and engage more community members in taking it next year. If you are interested in joining this project, please email eval@wikimedia.org.


María Cruz, Communications and Outreach Manager, Community Engagement
Edward Galvez, Survey Specialist, Community Engagement

References
  1. These numbers are heavily influenced by the sampling strategy and do not necessarily represent the population of editors.