Grants:IEG/Reimagining Wikipedia Mentorship/Final


Welcome to this project's final report! This report shares the outcomes, impact and learnings from the Individual Engagement Grantee's 6-month project.

Part 1: The Project


Summary

 

In a few short sentences, give the main highlights of what happened with your project. Please include a few key outcomes or learnings from your project in bullet points, for readers who may not make it all the way through your report.

After evaluating the state of existing help spaces on the English Wikipedia, we designed and conducted a one-month pilot for a mentorship space called The Co-op, where new editors were matched with experienced mentors to help them achieve their editing goals. Here are some of the main successes of The Co-op:
  • Mentorship is associated with increased productivity and retention of newer editors. Editors who were mentored made at least three times as many edits on average compared to editors who were not mentored. 68% of mentored editors remained active for one month after the pilot ended, whereas only 22% of non-mentored editors remained active in the same period.
  • We drastically reduced the amount of time new editors wait for a mentor. Median waiting time in The Co-op was about 12 hours, compared to a median wait of four days for a mentor through the Adopt-a-user program in 2011.
  • A surprising number of experienced editors participated in The Co-op despite not receiving an invitation. These editors also benefited from mentorship and were often engaged in more complex topics and questions.
We also faced some challenges in our pilot:
  • Only about half of the new editors in The Co-op actually engaged with their mentors; many editors simply dropped off after creating their profile. We therefore conclude that mentorship is better suited for editors who have some minimal but demonstrable degree of experience. This drop-off was also frustrating for mentors attempting to reach out and provide help.
  • We had 25 mentors respond to 49 editors seeking a mentor, which was a manageable load for the purposes of the pilot. For The Co-op to function more openly, it will require additional community support, both in the form of volunteer mentors to respond to requests and in ensuring that the matching functions are working properly.
  • Qualitative feedback on a survey we prepared to assess participants' experience using the Co-op was positive, but unfortunately minimal.

Methods and activities


What did you do in your project?

Please list and describe the activities you've undertaken during this grant. Since you already told us about the setup and first 3 months of activities in your midpoint report, feel free to link back to those sections to give your readers the background, rather than repeating yourself here, and mostly focus on what's happened since your midpoint report in this section.

Due to the extensions for this project, this final report presents our activities from November 2014 to April 2015, in addition to referencing work beginning in June 2014.

Conceptual planning

 
An early skill tree and prerequisites concept for the Co-op. This idea was simplified into general editing skill areas due to the tree's complexity and our prediction that it would be too complicated for newer editors.
  • We searched extensively through Wikipedia to build a mentor resources page that mentors could use for reference or teaching purposes, covering many topics. Its categorization was based on the skill tree developed earlier in our project as an idea for structuring editing skills; while the tree proved too complex for newer editors, we felt it was a good fit for the resources page. Eventually, this resources page may be retooled for learners, but it is currently aimed at mentors.
  • We determined color palette, logo, and page formatting through a series of discussions reviewing wireframes, mock-ups, and images. Priority was given to the front page in order to ensure that it felt inviting, easy to navigate, and made the purpose of the project clear from a new editor's standpoint.
  • We finalized the manner in which learners and mentors would be matched, which would primarily be on the basis of 1) the type of editing the learner wanted to do, and 2) the type of editing that mentors wished to teach about. These editing types were classified as:
    • Writing
    • Communication
    • Best practices
    • Technical editing
    • Images and media
    • Other
We developed brief descriptions and examples of these types of editing intended for learners. Unfortunately, we were unable to make these available to learners during the pilot period because profiles needed to be made manually using a pre-filled editing window.
  • We also determined a final layout for profiles, which included parameters for the editor's username, an image, a type of editing they wanted to learn about, and a space for them to describe themselves or their goal in more detail.
  • Using Phabricator, we began managing individual tasks related to the Co-op, such as bugs or enhancements. See here for more details and current work.
  • We worked extensively with the Flow team to examine the possibility of implementing Flow into the mentor-learner communication structure, but both our team and the Bot Approvals Group expressed concerns about certain bugs that had not been resolved, particularly those related to deletion. We are still interested in testing out Flow when these bugs have been resolved.

Design and development

 
An early set of wireframes detailing the general structure of the space and basic contents for the landing page of The Co-op.
 
A closer-to-final draft of the front page.
  • Graphic design elements, in addition to general layouts, were implemented in the MediaWiki interface using appropriate wiki markup, HTML, and CSS. Additional work was needed to simplify the code for profiles and ensure it was suitable for matching editors together.
  • An organizational scheme was developed such that specific components of Co-op pages would be transcluded onto a main page. For instance, the front page consists of:
    • a header,
    • the main interactive component,
    • a feed displaying existing mentors and learners,
    • and a navigation box.
  • In addition to the Navbox, we put together a site map of the Co-op for navigation. This map is designed to give anyone browsing the Co-op pages in the future, whether for maintenance or further development, a better idea of how things are laid out.

Testing and implementation

  • Our team determined that matching in the space would be best accomplished using categories on learner/mentor profiles and a bot to detect potential matches and notify the relevant editors.
  • A basic model of The Co-op was developed on test.wikipedia.org to test and debug the matching functions and to develop the structure of learner and mentor profiles.
  • Some debugging was needed based on the way Co-op templates were being used, like on the front page layout.

Engagement with learners

  • We designed invitations for newer editors interested in mentorship. Many of them were sent via HostBot, but a small proportion were sent manually to newer editors identified using Snuggle.
  • Collectively, we mentored 24 editors during the one-month pilot period.
  • We developed a survey protocol for learners to assess their experience of the Co-op. Unfortunately, the available sample for surveying and the subsequent response rate for this survey were quite low.

Engagement with mentors

  • We spent time recruiting mentors for our pilot and generally alerting editors to the space:
    • We posted a broad recruitment notice to WikiProject Editor Retention.
    • A participant on the gendergap-l listserv provided some additional attention to our project in December 2014 ([1]).
    • We spent the most time personally inviting several editors who we had observed working with newer editors in the past and who would be well-suited for mentorship (e.g. [2], [3], [4]).
 
A screenshot of the Co-op landing page during its pilot period.
  • In December 2014, we initiated a pre-pilot discussion to involve mentors in thinking about how the space would function, well before the pilot started. Specific instructions were also written in preparation for the pilot.
  • We were responsive to ideas and questions from our volunteer mentors, using the Co-op talk page to address comments, suggestions, and concerns.

Research and analysis


Outcomes and impact


Outcomes


What are the results of your project?

Please discuss the outcomes of your experiments or pilot, telling us what you created or changed (organized, built, grew, etc) as a result of your project.

Data collection and participants

 

The research portion of this grant set out to understand the experience of newcomers using help spaces on Wikipedia. To do this, we drew a sample of newcomers who had used Articles for Creation, the Teahouse, or The Wikipedia Adventure. We also limited our sample to accounts that had at least 50 edits and had been created after July 2014 (link to query used for sampling: http://quarry.wmflabs.org/query/1142). Of the accounts returned in the sample, 290 were contacted based on criteria that included avoiding banned accounts as well as accounts that had not been active within the past month. Of the 290 newcomers we reached out to, 46 completed the survey, for a response rate of 15.8%. In addition to the survey, we conducted 10 semi-structured one-hour interviews with newcomers who agreed to be interviewed after taking the survey.

The experimental portion of this grant set out to pilot a mentorship space, called The Co-op, intended to facilitate learning about contributing to Wikipedia and to help interested editors accomplish their editing goals. We sampled editors through automated Co-op invitations to user talk pages using HostBot. HostBot identified new editors who had made at least 10 edits in a 24-hour period (the same criteria currently used for Teahouse invitations) and invited a sample of these editors. Roughly 20 invitations were sent out daily from 4 March to 14 March 2015, and then from 21 March to 15 April 2015. (This break in invitations was due to the expiry of the bot function's trial approval from the Bot Approvals Group.) In total, approximately 980 invitations were sent. There were 49 learners, that is, participants in The Co-op, defined by their creation of the profile necessary for mentorship. Interestingly, seven learners created a Co-op profile without an invitation.
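To illustrate the invitation protocol described above, here is a minimal sketch in Python; the function and field names are hypothetical assumptions, and this is not HostBot's actual code.

```python
import random

# A hypothetical sketch of the invitation rule described above; the field
# names ("edits_last_24h", "username") are illustrative assumptions.
def pick_invite_candidates(recent_editors, already_invited, daily_cap=20):
    """Select up to daily_cap new editors with 10+ edits in the past 24 hours."""
    eligible = [
        e for e in recent_editors
        if e["edits_last_24h"] >= 10 and e["username"] not in already_invited
    ]
    # Invite a random sample rather than every eligible editor.
    return random.sample(eligible, min(daily_cap, len(eligible)))
```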

The following write-up will first review what our interviews and surveys revealed about the newcomer experience with not only the three targeted help spaces but also other newcomer resources that came up in the data. The second portion will discuss the impact of mentorship through The Co-op.

Research on existing help spaces

Usage

Table 1 reflects the breakdown of help space use from the 46 survey responses. Respondents were allowed to select multiple choices, giving us a sense of the range of help spaces being used. In addition to asking about the three help spaces that defined the sampling strategy, we gave respondents options to choose from a range of other help resources, as well as to indicate the use of multiple help spaces and the sequence in which they were used (see Table 2).

Help space | Number of respondents
Adopt-a-user | 1
Teahouse | 24
Reference Desk | 6
Education program | 2
IRC | 8
The Wikipedia Adventure | 26
Articles for Creation | 11
Other | 7
Multi-space sequence | 13

Table 1. Breakdown of help space use

Responses under "Other" included "Friends," "Village Pump," "Help Desk," and "users who left messages on my talk pages".


Multi-space use sequence
Teahouse → Village Pump
Teahouse → The Wikipedia Adventure (x2)
Teahouse → The Wikipedia Adventure → Articles for Creation
The Wikipedia Adventure → Teahouse → Reference Desk → Talk page
The Wikipedia Adventure → Reference Desk → IRC
The Wikipedia Adventure → Articles for Creation (x2)
Talk message → Teahouse → The Wikipedia Adventure
Friends → Teahouse → The Wikipedia Adventure
Reference Desk → Teahouse

Table 2. Multi-space use sequences

Finding help

Through long-form answers in the survey and in interviews, we found that the majority of respondents were directed to help spaces either by a bot-generated welcome message left on their talk page or directly by another user or friend, via on- or off-wiki interaction. In some cases, newcomers participating via an offline commitment like the Education Program or GLAM were directed by a project leader or professor to sign up for The Wikipedia Adventure so that they could get a basic understanding of how to contribute to Wikipedia. There were only a few mentions of newcomers stumbling across a help space on their own. For example, one respondent noted that they found the Teahouse after a Google search on a problem they were trying to address.

Why help spaces are used
 

One of the important questions that we want to understand going forward with further iterations of the Co-op is what challenges or obstacles drive newcomers to help spaces.

The survey revealed that 80% of respondents were driven by a general sense of uncertainty when seeking out support in the help spaces. As one newcomer noted “I was just lost at the beginning and didn't know where to start.” Another stated, “I had a general beginner's need for knowledge, no specific goals.”

This sentiment of uncertainty was supported by themes that emerged from the interviews, where newcomers described a sense of being overwhelmed, in particular by the wide range of policies and how to interpret them, as well as by the initial complexity of the syntax.

  • “Prior to TWA I found the syntax on the edit page to be very intimidating and was worried that I would mess things up.”
  • “Guidelines are easy to find but their interpretation or execution is not always a straightforward process.”

The above quotes indicate that, layered within a general sense of uncertainty, are specific needs that newcomers seek to address in the help space. As the survey revealed, 78% of respondents had a sense of what they wanted to learn when they arrived at the help space. The most common theme that emerged regarding motivations for seeking help had to do with policy. In particular, there were a number of instances relating to questions of notability as well as copyright policy for image use. As the concerns around policy might imply, the data revealed that newcomers came to know what they wanted to learn only after some initial exploration in making edits to Wikipedia. As such, what appear as contradictory answers between broad versus specific newcomer needs may instead point to newcomers being overwhelmed by specific facets of the Wikipedia experience.

Of note, 71% of respondents disagreed with the statement “I participated because I had a bad experience with other editors.” This is of course encouraging, given previous research that correlates the editor decline on Wikipedia with aggressive reverting of newcomers by established editors. This may merit further investigation.

Impact

As the previous section might predict, newcomers indicated that the help spaces were particularly useful for understanding the basics of contributing to Wikipedia. 87% of survey respondents indicated that they found the answers they were looking for in the help spaces. 51% found editing to be easier, 37% were neutral about a change in difficulty, while 12% still found editing to be difficult.

The following responses from the interviews and long form answers in the survey provide insight on what newcomers learned:

  • I learned how to edit properly, basic [wiki-etiquette], and how to add images
  • It helped me gain practice at basic editing.
  • How to put in pics and how to reference properly
  • How Talk pages work.
  • How to correctly interpret some guidelines
  • I learned to persevere and ask for help

The final bullet point in the above list addresses what might be described as an attitudinal benefit of newcomers using help spaces. For example, two respondents who had gone through The Wikipedia Adventure noted that they felt less intimidated by the code on the edit page and confident that they would not “mess things up.” Another respondent described how having their article go through the Articles for Creation feedback process gave them the confidence that the community would accept what they wrote.

Beyond newcomers feeling empowered after participating in the help space, another prominent theme that emerged about the benefits of help spaces is that they consolidate the learning process. As one respondent pointed out, Wikipedia is not lacking in explanations of how it is to be used; rather, it is this abundance of resources that makes it overwhelming for a newcomer.

one of my problems or complications I have with Wikipedia is there are multiple entries into it and multiple venues that teach you how to use it, but there is no standard protocol set up, there is no one common approach and it just reminded me of driving into a big city without a map.

Speaking to the above concern, newcomers expressed some satisfaction with the way help spaces gave them an introduction to the basics of Wikipedia. As one respondent points out, the Wikipedia Adventure (TWA), for example, provided them with “simple, plain instruction as to how to start out.” Another respondent described how TWA acts as “a good source for general orientation to editing Wikipedia.”

The theme of help spaces consolidating fundamental information for newcomers was echoed in answers to the question of whether or not newcomers would have been able to learn what they wanted had they not found the help space. As one respondent described:

Not really. Attempted to find the information I needed on my own but swamped by the amount of information out there. The help space enabled me to ask a specific question and get a quick specific answer without spending hours wading through everything.

Broadly, help spaces appear to accomplish two key services: First, they provide newcomers with foundational knowledge about contributing to Wikipedia and second, they simplify the information seeking experience by acting as a hub for general orientation information. In the following section, we examine some of the unique strengths of the various help spaces.

Unique strengths of help spaces

The various help spaces we focused on each appeared to provide their own unique value to the broader newcomer support ecology. We speak to each in turn.


 

Articles for Creation (AfC): As the space is intended to provide, pointed and constructive feedback was described as useful to newcomers learning how to write an article. This is perhaps why newcomers in our interviews described the AfC space as a valuable source of feedback that they were not able to obtain outside of AfC. This sentiment was supported by data from our survey, in which 80% of respondents who did not participate in The Wikipedia Adventure indicated that finding help from other Wikipedians via the help spaces was easy. Building on the theme of feedback, one respondent described how the feedback they received on their article showed them that there was a community of people on Wikipedia who cared about the topic of their article.

I felt like there was a community there, not only to ask questions of but also to help guide me. That I think is what made me think this whole endeavor was a success…it was a real comfort to me.


 

Teahouse: One of the strengths that we observed in the data pointed to the value of the Teahouse for interpreting policies and guidelines. As one respondent noted, it is not hard to find guidelines and policies on Wikipedia; it is understanding the context of their application that is difficult. Finding contextual clarity is something they found the Teahouse to be particularly useful for.

In an interview with a mentor from the Teahouse, we repeatedly heard how the informality of the Teahouse is what makes it so successful. The mentor noted that at the Reference Desk, people are expected to know the fundamentals, but not at the Teahouse. At the Teahouse, someone can show up without a clue on how to edit Wikipedia whereas you need to have a specific, well-articulated question when you use the Reference Desk. The perceived value of the Teahouse’s informality may help to explain why it is so useful for seeking contextual clarity around guidelines and policies.


 

The Wikipedia Adventure (TWA): Many of our interview subjects described TWA as a good introduction to the basics of contributing. Building on the theme of how help spaces simplify the newcomer information-seeking experience, TWA was often described as a resource that helps make the process of learning how to navigate an intimidating and overwhelming environment more manageable. When asked what their experience would have been like without TWA, one newcomer replied, “I think initially it would have been a nightmare, it would not have been impossible, but I probably would have missed a lot of valuable information.”

Looking across the different spaces: One of the interesting findings is that the help spaces appear to provide contextual clarity to seemingly ambiguous or highly mutable guidelines of editing. This is worth noting as a valuable resource for newcomers in that many newcomers find themselves on the receiving end of feedback that does not provide the clarity they need for further participation.

Potential improvements

While not a prominent theme, some respondents raised concerns about the formality of AfC and TWA, noting that their structured approach overlooked more nuanced obstacles faced by newcomers. For example, one respondent pointed out that TWA presents an idealized experience of participation in Wikipedia and that some of what they learned did not apply to what they experienced.

“I think the downside of the Wikipedia Adventure…is that at least my experience with Wikipedia is that some of the things that they discuss don’t really work the way that I’ve experienced them that they’re supposed to work as described in the Wikipedia Adventure.”

Another newcomer using AfC described how strict adherence to the correlation between notability and appropriate references made it difficult for her to get articles accepted about women in science, who, unlike their male counterparts of similar achievement, often lack sufficient references.

Using multiple help spaces

Surveys and interviews provided insight into the way newcomers moved between the different help spaces. As described in the previous section on the strengths of help spaces, The Wikipedia Adventure was seen as a space where newcomers can lay a broad foundation for their experience as editors. From there, we observed newcomers moving to the Teahouse with the understanding that it was a more appropriate resource for pursuing specific and contextually unique questions about their editing experience. While Table 2 does point to evidence that some newcomers started with the Teahouse before moving on to other help spaces, we do not have data that speaks to the experience of newcomers who moved through this sequence and how it affected their learning. Unpacking the movement through these spaces and how the different sequences impact the newcomer experience would benefit from further research and would help us gain more insight into the growing newcomer support ecology and the different pathways to participation.

Beyond help spaces

Our data revealed other resources outside of formal help spaces that newcomers were drawing on. The more traditional approach of “observing” others' work was identified as a resource. For example, one newcomer described how they would look at articles on topics similar to the article they were working on and copy and paste syntax to achieve appropriate formatting.

In some cases, interviews revealed newcomers benefiting from ongoing interaction with other Wikipedians. For example, we interviewed a newcomer who came to Wikipedia to work on a very specific topic related to her work. Early on she connected with an experienced editor who was also working on topics in her field who went on to become her mentor, providing her with feedback and directing her to resources like WikiProjects. In other examples we observed newcomers who found mentors through offline editing events like GLAM workshops.

How the above findings affected development of the Co-op

The above findings helped us address important questions in developing the Co-op: How do we invite newcomers? How should mentorship be framed? What strengths of existing or previous help spaces can we draw upon?

In the case of engaging newcomers, our findings showed that the majority of respondents found out about help spaces through invitations rather than by stumbling across them. As such, our outreach focused on inviting newcomers with the support of HostBot. We also used Snuggle, a program that lists and provides editing history for new and active editors, to identify individuals who might benefit from mentorship. Drawing on the desired skills that were most popular among respondents, we ensured that skills such as formatting, social interaction, and guideline interpretation were choices with high visibility in the interface when newcomers signed up for the Co-op.

We also worked at incorporating the unique strengths of various help spaces into the design of the Co-op. For example, we incorporated the informality of the Teahouse by allowing space for newcomers to write out, to the best of their ability, the nature of the challenges that drew them to seek help. Where AfC provided targeted feedback, we gave newcomers the opportunity to seek out feedback based on specific aspects of their editing experience. Finally, the experience of newcomers with TWA pulled us towards providing Co-op participants with an inviting and friendly interface that guided the profile creation process and automatically matched their needs with appropriate mentor skills.

Findings from the Co-op pilot

Initiating mentorships

One important goal of this pilot was to ensure that we developed a process by which learners were matched with mentors successfully and promptly.

Explanation of Co-op matching system

First, it is important to describe how matches were formed. Matches were made through profiles created by both mentors and learners based on the type of editing the learner needed help with and the kind of editing mentors were willing to teach. These types fell into one of the following six categories (see this page for descriptions):

  • Writing
  • Best practices
  • Communication
  • Images and media
  • Technical editing
  • Other

Learner profiles also included a space for learners to describe in more detail why they were looking for a mentor or what they sought to accomplish through mentorship. We thought this would help mentors engage more meaningfully with learners based on their needs and goals.

Mentors also created similar profiles where they chose one or more of the above categories to teach about. They were also able to designate themselves as "unavailable," meaning they would not be matched with a learner. For mentors, we replaced the "Other" category with a category named "General editing," which allowed a mentor to be matched on any of the above categories when no category-specific mentor was available. (e.g. if no mentors who selected "Images and media" were available, a learner seeking that category could be matched with a mentor under "General editing.")

Mentor and learner profiles would then be matched together. Initially, these matches were made manually by notifying suitable mentors. Later, upon receiving approval from the Bot Approvals Group, matches were made automatically using HostBot from 10 March to 14 April 2015. HostBot scanned for unmatched profiles every five minutes and matched them based on the above categorical parameters.
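To make the matching rule concrete, here is a minimal sketch in Python. The data structures and function name are assumptions for illustration; the pilot's actual matching ran inside HostBot against wiki profile categories.

```python
# A minimal sketch of the matching rule described above. Profile fields
# ("category", "categories", "unavailable") are assumed names, not
# HostBot's actual data model.
def match_learner(learner, mentors):
    available = [m for m in mentors if not m.get("unavailable")]
    # First pass: mentors who explicitly teach the learner's requested category.
    for mentor in available:
        if learner["category"] in mentor["categories"]:
            return mentor
    # Fallback: "General editing" mentors accept learners in any category.
    for mentor in available:
        if "General editing" in mentor["categories"]:
            return mentor
    return None  # no match yet; the bot retries on its next five-minute scan
```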

We collected measures to evaluate the formation of mentorships. Specifically, for the 49 learners who made profiles, we looked at whether four steps took place, all of which were necessary for a learner to be considered mentored (a sketch of this check follows the list):

  1. Were learners matched with a mentor?
  2. Did their mentor introduce themselves?
  3. Did learners respond to their mentor once contacted?
  4. Did learners continue to edit during or after talking with their mentor?
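As a minimal sketch, this classification is a simple conjunction of the four steps; the boolean field names below are assumptions for illustration, not our actual analysis code.

```python
# A sketch of the four-step check above; field names are illustrative.
def is_mentored(learner):
    """A learner counts as mentored only if all four steps took place."""
    return (
        learner["matched"]                    # 1. matched with a mentor
        and learner["mentor_introduced"]      # 2. mentor introduced themselves
        and learner["replied_to_mentor"]      # 3. learner responded to contact
        and learner["edited_during_or_after"] # 4. learner continued editing
    )
```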

Approximately half of the learners (n=25) were not mentored because not all of the above steps were completed. In the majority of these cases (n=21), steps 1 and 2 were completed, but not step 3 or 4, indicating that the learner did not interact with their mentor or did not edit after an initial reply to their mentor. In two other cases, learners did not choose matching categories and so could not be matched. In the remaining two cases, mentors did not initiate contact with the learner: in one, the mentor had sustained an injury that made it difficult to participate; in the other, toward the end of the pilot period, the mentor became inactive without notice and a replacement mentor could not be identified in time.

The other half of our sample (n=24) did interact with their mentor and edited during mentorship. While this attrition in the project was surprising, the split does allow for useful comparisons between "mentored" and "non-mentored" groups as a means to evaluate the potential impact of mentorship. These comparisons will be evaluated further in the next section.

It is difficult to determine why so many learners did not follow up with mentors once they were contacted. However, it is important to note that our sample consisted of very new editors. A few possible explanations for this attrition include:

  • Confusion around the use and conventions of user talk pages;
  • The fact that mentors were encouraged to contact editors on the talk page associated with the learner's Co-op profile, which may have been difficult to find;
  • Failure of mentors to use notifications;
  • The tendency of many new editors to become inactive after making a small number of edits.

We also looked at the duration of each of the first two steps, measured from the time the learner profile was created, as well as their total. In this analysis, we removed two cases that were 3+ standard deviations from the average:

Measurement | Profile creation → Step 1 (match) | Step 1 → Step 2 (mentor introduction) | Total
Mean | 6.30 (SD: 9.65) | 12.23 (SD: 19.43) | 18.36 (SD: 19.38)
Median | 1.57 | 5.72 | 11.85

Table 3. Average and median durations for steps involved in initiating mentorship, in hours (with standard deviations)

Because there were six cases that reached step 1 but either did not reach step 2 or were excluded, the total is not the sum of the two durations.

It took an average of about 18 hours for learners to be contacted by their mentor after making their Co-op profile. Despite removing two outliers from our data, standard deviations continued to show high variability in our sample, so we have also reported median values. These show that it took only about 12 hours for mentorship to be fully initiated, and that the individual steps took considerably less time.
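For clarity, the outlier handling and summary statistics above can be sketched as follows; this is a simplified reconstruction, not the exact analysis script behind Table 3.

```python
import statistics

# A simplified sketch of the duration analysis above: drop cases more than
# three standard deviations from the mean, then report mean, SD, and median.
def summarize_durations(hours):
    mean, sd = statistics.mean(hours), statistics.stdev(hours)
    kept = [h for h in hours if abs(h - mean) < 3 * sd]
    return {
        "mean": statistics.mean(kept),
        "sd": statistics.stdev(kept),
        "median": statistics.median(kept),
    }
```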

In many cases, delays could be attributed either to errors during profile creation or to the unavailability of a mentor. If a profile was created correctly, learners could be matched in less than five minutes (and some learners were). Profile errors had to be detected and fixed manually during the pilot. When a mentor was not available, we sometimes needed to seek out another mentor to fill in, which could take some time.

In comparison, the median time for an editor to wait for a mentor through the Adopt-a-user program was about four days in 2011.[1] The Co-op thus effectively reduced the median wait for a mentor by roughly 84 hours, or 3.5 days. With the Co-op, a learner could reasonably expect to hear from a mentor within the same or the next day and begin working together.

What happened during mentorship?

Editors coming into the Co-op used their profiles to describe what they wanted to learn or accomplish through mentorship. Using this information, in addition to evaluating the interactions between learners and mentors thereafter, we identified common topics for which learners requested a mentor:

  • Getting started with a new article or AfC draft
  • Resolving article problems (such as maintenance tags or deletion)
  • Understanding the concept of notability / distinguishing between generally reliable and unreliable sources
  • Avoiding copyright and promotional issues with text
  • Learning about wiki markup

More advanced editors delved into other areas, such as learning about DYK nominations, dealing with disruptive editors, and understanding style guidelines for specific article topics.

We noticed a few interesting interactions arise as well:

  • A learner and mentor, coincidentally working in the same area, made plans to meet at a conference and edit-a-thon event (see here)
  • A learner based in China, who had difficulty accessing sources, was able to obtain sources with help from their mentor to improve articles on classical music. (see here)
  • A learner and mentor had substantial discussion surrounding best practices on consensus-building, interacting with other editors, and editing in controversial areas. (see here)

Impact on learners

A number of measures were collected on the productivity of editors after they made profiles at the Co-op: edit counts after profile creation, unique article edits, articles and drafts created, and the number of reverted edits. In this analysis, we excluded cases where the learner was blocked or was clearly intending to edit against policy (n=5), cases where the learner was an experienced editor (n=5), and cases where a learner created a profile but did not actually need a mentor (n=2).

Values for these measures were characterized by high variability from learner to learner. For example, here is the distribution of edit counts we collected from learners after they made their profiles:

 
Figure 1. Histogram of edit counts for the overall sample from Co-op profile creation to 3 May 2015.


While most editors in the sample made between 0 and 70 edits, some made substantially more during the pilot period. Consequently, median values were also included in the analysis to account for this variability. Table 4 shows measures for the overall sample based on productivity from profile creation in March to 3 May 2015.

Measure | Edits after Co-op profile creation | Articles edited | Articles and drafts created | Reverted edits
Mean | 44.86 (67.86) | 6.78 (17.03) | 0.70 (1.02) | 0.97 (1.95)
Median | 16 | 1 | 0 | 0
Total | 1660 | 251 | 26 | 36

Table 4. Measures of productivity from March to 3 May 2015 for the overall sample (with standard deviation)


When adding in the five experienced editors who sought mentorship, it is clear these particular editors were quite productive. Table 5 shows the same statistics including experienced editors:

Measure | Edits after Co-op profile creation | Articles edited | Articles and drafts created | Reverted edits
Mean | 90.41 (215.71) | 10.44 (22.84) | 1.05 (2.32) | 1.66 (2.32)
Median | 23 | 2 | 0 | 0
Total | 3707 | 428 | 43 | 68

Table 5. Measures of productivity from March to 3 May 2015 for the overall sample (with standard deviation), including experienced editors.


Mean edit counts doubled from about 45 to about 90, and the mean number of articles edited increased from about 7 to about 10. The total number of edits also sharply increased, from 1660 to 3707. However, as these particular editors had been editing Wikipedia for some time already, it is less certain whether this increased productivity was due to participation in the Co-op.

The overall impact paints a mixed picture. Using the median values, a learner would typically make about 16 edits after making their profile, and would edit just one article. However, as mentioned previously, there were important differences in learner engagement with mentors once they created a Co-op profile. Roughly half of the learners in our sample actually engaged with their mentors, while the other half did not. We refer to these groups as a mentored group and a non-mentored group. When evaluating these same criteria between our groups, a different pattern emerges:

Group | Measure | Edits after Co-op profile creation | Articles edited | Articles and drafts created | Reverted edits
Mentored | Mean | 63.84 (63.00) | 10.16 (23.16) | 0.79 (0.85) | 1.11 (2.02)
Non-mentored | Mean | 24.83 (68.70) | 3.22 (4.47) | 0.61 (1.20) | 0.83 (1.92)
Mentored | Median | 35 | 2 | 1 | 0
Non-mentored | Median | 4.5 | 1 | 0 | 0

Table 6. Measures of productivity from March to 3 May 2015 between mentored and non-mentored groups (with standard deviation).


Here we see some impact of learner-mentor engagement. Looking at averages, learners who got the benefit of mentorship were much more productive during the pilot in terms of edit counts (63 vs. 25) and unique articles edited (10 vs. 3). Median values show the same benefit for edit counts, but not for unique articles edited. There did not appear to be differences between the groups in the number of articles/drafts created or in reverted edits.

We also compared our learner groups on whether they remained active on Wikipedia, defined as making at least five edits from 1 April 2015 onward. In the mentored group (n=19), 13 learners remained active, a retention rate of 68%. In the non-mentored group (n=18), only 4 learners remained active, a retention rate of 22%.

Comparing the groups, mentorship appears to have had real benefits for newer editors. Editors who engaged with mentors were much more likely to remain active, contributed more in terms of edit counts, and may have edited across a broader set of articles.

Progress towards stated goals


Please use the below table to:

  1. List each of your original measures of success (your targets) from your project plan.
  2. List the actual outcome that was achieved.
  3. Explain how your outcome compares with the original target. Did you reach your targets? Why or why not?
Planned measure of success (including numeric target, if applicable), actual result, and explanation:

Measure: Improving editor activity compared to existing help spaces
Result: See Global Metrics below for overall editor activity. Within our sample, editors who engaged with their mentors were more active and edited more than editors who made profiles but did not engage with their mentor.
Explanation: Unfortunately, we were not able to obtain reasonable measures of activity for other help spaces with which to compare our data. However, comparisons within our sample suggest that mentorship did improve editor activity.

Measure: Improving editor retention compared to existing help spaces
Result: 68% of new editors who created Co-op profiles and engaged in mentorship remained active one month after the pilot ended. Only 22% of editors who made profiles but did not engage in mentorship remained active.
Explanation: Editor retention was quite high for editors who engaged in mentorship. In comparison, the pilot report for the Teahouse indicated that 33% of editors remained active two weeks after they started. Mentorship appears to encourage editors to stick around.

Measure: Attract 50 editors to participate in The Co-op as editors seeking a mentor (learners)
Result: We recruited 49 editors during the pilot period, and there are 69 learners at present.
Explanation: The Co-op did not close immediately after the pilot ended and continued to accept new learners for a time. More or less, we met our goal of attracting an initial base of newer editors to try out mentorship through The Co-op.

Measure: Recruit 30 editors to participate in The Co-op as mentors
Result: We recruited 25 editors prior to the pilot period, and there are still 25 mentors at present.
Explanation: Slightly under target, but 25 mentors proved to be enough to appropriately match the influx of learners during the pilot period.

Measure: Cut the time to find a mentor in half
Result: The median time for a mentor to make direct contact with a learner in the Co-op was about 12 hours. In 2011, the median waiting time for the Adopt-a-user program was about four days.
Explanation: We drastically reduced the amount of time new editors were waiting. Much of this can be attributed to the responsiveness of mentors and the bot-driven matching system that alerted mentors when matches were made.

Measure: High satisfaction ratings in follow-up surveys to learners
Result: "It's great. It's different than the IRC help channel— more personalized." / "The Co-op has the personal touch that helped me swim in a sea of information; I am very grateful." / "[The Co-op] was better, although I did learn many of the basics on The Wikipedia Adventure, some advanced stuff I found on co-op."
Explanation: While feedback in our follow-up survey was limited, the available feedback showed that learners had a positive experience. Some feedback suggested that the Co-op may be better suited for editors who have had a little engagement with Wikipedia already.

Measure: Higher qualitative satisfaction when directly comparing The Co-op and existing programs
Result: N/A
Explanation: The response rate from learners to the follow-up survey was too low to make meaningful comparisons.


Think back to your overall project goals. Do you feel you achieved your goals? Why or why not?

Some of our goals were very clearly met during the pilot. Editors seeking mentorship waited far less time for a mentor compared to other options available on en.wiki. Editors who interacted with their mentor were more productive than those who did not. We also met our recruitment goals, both in attracting editors interested in mentoring and in attracting newer editors to our help space. However, we were not able to fully capture what the experience of using our space was like from the learner perspective, due to our relatively small sample and low response rate.

Global Metrics


We are trying to understand the overall outcomes of the work being funded across all grantees. In addition to the measures of success for your specific program (in above section), please use the table below to let us know how your project contributed to the "Global Metrics." We know that not all projects will have results for each type of metric, so feel free to put "0" as often as necessary.

  1. Next to each metric, list the actual numerical outcome achieved through this project.
  2. Where necessary, explain the context behind your outcome. For example, if you were funded for a research project which resulted in 0 new images, your explanation might be "This project focused solely on participation and articles written/improved, the goal was not to collect images."

For more information and a sample, see Global Metrics.

Metric | Achieved outcome | Explanation
1. Number of active editors involved | 52 | Of the 49 learners who created a profile in the space during the pilot, 24 actually interacted with their mentor and made contributions afterward. Of those 24 editors, 18 remained active (i.e. made 5+ edits since 1 April, roughly one month after the end of the pilot period). All 25 mentors remained active. Of the 12 organizers, our 3 contractors were not active editors through the project. 18 + 25 + 9 = 52.
2. Number of new editors | N/A | The purpose of the Co-op was not directed toward recruiting new editors; only editors who had already made accounts were involved.
3. Number of individuals involved | 86 | While several hundred editors were invited to the space over the one-month pilot, actual participants in the Co-op included 49 learners and 25 mentors, according to totals from our lists of learners and mentors. (Note: some entries on the list of mentors are prefilled forms or test profiles and do not represent actual individuals.) The organizing team consisted of 12 individuals: 3 grantees, 3 contractors, 4 volunteers, and 2 advisors. 49 + 25 + 12 = 86.
4. Number of new images/media added to Wikimedia articles/pages | 76 |
5. Number of articles added or improved on Wikimedia projects | 428 | Of these 428 articles, 43 were newly created by Co-op learners during the one-month pilot period.
6. Absolute value of bytes added to or deleted from Wikimedia projects | N/A | As this was not a single event, but consisted of editors coming in at different periods during the one-month pilot, we could not obtain information about bytes added.


Learning question
Did your work increase the motivation of contributors, and how do you know?
  • Motivation was not something we were able to directly assess in this study, though we did attempt to do so. However, because we observed that many editors who engaged in mentorship also remained active beyond the pilot period, we believe the Co-op space itself and the work of mentors were a factor in motivating newer editors.

Indicators of impact


Do you see any indication that your project has had impact towards Wikimedia's strategic priorities? We've provided 3 options below for the strategic priorities that IEG projects are mostly likely to impact. Select one or more that you think are relevant and share any measures of success you have that point to this impact. You might also consider any other kinds of impact you had not anticipated when you planned this project.

Option A: How did you increase participation in one or more Wikimedia projects?

  • The Co-op facilitated increased participation on the English Wikipedia by helping newer editors reach their goals. This was evidenced by increased editing and higher retention among editors who engaged in mentorship compared to those who simply made profiles and did not. We also noticed that even experienced editors were interested in and, in some cases, benefited from mentorship, often seeking to learn about more complex or uncommon kinds of editing.

Project resources


Please provide links to all public, online documents and other artifacts that you created during the course of this project. Examples include: meeting notes, participant lists, photos or graphics uploaded to Wikimedia Commons, template messages sent to participants, wiki pages, social media (Facebook groups, Twitter accounts), datasets, surveys, questionnaires, code repositories... If possible, include a brief summary with each link.


  • Writing

  • Communication

  • Best Practices

  • Technical

  • Images

Learning


The best thing about trying something new is that you learn from it. We want to follow in your footsteps and learn along with you, and we want to know that you took enough risks in your project to have learned something really interesting! Think about what recommendations you have for others who may follow in your footsteps, and use the below sections to describe what worked and what didn’t.

What worked well


What did you try that was successful and you'd recommend others do? To help spread successful strategies so that they can be of use to others in the movement, rather than writing lots of text here, we'd like you to share your findings in the form of a link to a learning pattern.

What didn’t work


What did you try that you learned didn't work? What would you think about doing differently in the future? Please list these as short bullet points.

  • One difficulty in this project was trying to anticipate the expectations of new editors coming into our space. For instance, we framed mentorship to be focused on broad tasks in editing, whether it be writing, communication, or figuring out wiki-markup. Newer editors, however, would frequently fill out profiles with more topical interests (e.g. [5]). In reconsidering how profiles should be structured at the Co-op, adding a parameter to incorporate such topical interests may be useful.

Other recommendations


If you have additional recommendations or reflections that don’t fit into the above sections, please list them here.

  • The Co-op needs to address common errors in profile creation if it is to be realistically maintained. A slick piece of software called FormWizard will help facilitate profile creation and avoid some of the errors that prevent matching and create extra maintenance work.
  • We hope to address the high attrition rate in the Co-op (i.e. editors who made profiles but did not engage in mentorship, roughly half of our sample) by adjusting who receives invitations. We expect better engagement if we invite editors who are autoconfirmed and recently active, rather than very new editors who happen to be active when they initially make their accounts. We may also consider implementing an edit count requirement so that we bring in editors who are likely to stick around.
  • One interesting observation coming out of our background research, and to some extent in our pilot, was that it is common for newer editors to use multiple help spaces. This may happen for several reasons, such as seeking an answer to the same question in multiple places, or because one space may be better suited to provide help in some areas (e.g. starting a new article), but not others (e.g. obtaining permissions from copyright holders). Some are also better suited for very new editors, whereas others are not. In any case, we feel that there needs to be a better system for directing new editors to an appropriate help space based on their needs and experience.

Next steps and opportunities


Are there opportunities for future growth of this project, or new areas you have uncovered in the course of this grant that could be fruitful for more exploration (either by yourself, or others)? What ideas or suggestions do you have for future projects based on the work you’ve completed? Please list these as short bullet points.

  • We are seeking support from the editing community in the form of mentors to help continue the success of The Co-op. To this end, we will be summarizing our study for some WikiProjects, the Signpost, and the WMF blog. We also welcome community feedback on how to improve how the Co-op works (e.g. what changes could be made to improve our matching parameters?).
  • We are in the process of implementing FormWizard in profile creation to reduce errors and continue to reduce the amount of waiting time for learners to begin mentorship.
  • Invitations were previously sent to editors with new accounts who had made at least 10 edits within the past 24 hours. Because of the high rate of attrition among very new editors in the study, we are instead inviting editors who are autoconfirmed and recently active, with the expectation that more editors will participate in mentorship.
  • Implementation of Flow to facilitate learner-mentor interactions was discussed during this project, but concerns surrounding certain bugs related to deletion made integrating it impractical for our pilot. However, many of these bugs appear to have been resolved, and we look forward to testing out the Flow software in our space soon.
  • Jethro will continue to mentor and oversee the above plans, but will no longer oversee The Co-op day-to-day. He is willing to help facilitate discussions about the conventions of the Co-op (e.g. changing how editors are matched, who we invite, or how we handle absent mentors). Soni is also willing to help as a mentor and to help improve the Co-op more generally. Gabe and Jethro have also discussed a few possible conventions or publications with which to share our findings.




Part 2: The Grant


Finances


Actual spending


Please copy and paste the completed table from your project finances page. Check that you’ve listed the actual expenditures compared with what was originally planned. If there are differences between the planned and actual use of funds, please use the column provided to explain them.

Expense | Approved amount | Actual funds spent | Difference
Programmer | $3000 | $600 | $2400
Graphic designer | $4000 | $4335 | $-335
Program development (Soni) | $5200 | $5200 | $0
Project manager (I JethroBT) | $7800 | $9865 | $-2065
Researcher (Gabrielm199) | $2600 | $2600 | $0
Community engagement & PM (extension period) | $3560 | $3560 | $0
Total | $26160 | $26160 | $0


Remaining funds


Do you have any unspent funds from the grant?

Please answer yes or no. If yes, list the amount you did not use and explain why.

  • No.

If you have unspent funds, they must be returned to WMF. Please see the instructions for returning unspent funds and indicate here if this is still in progress, or if this is already completed:

Documentation


Did you send documentation of all expenses paid with grant funds to grantsadmin wikimedia.org, according to the guidelines here?

Please answer yes or no. If no, include an explanation.

  • Yes.

Confirmation of project status


Did you comply with the requirements specified by WMF in the grant agreement?

Please answer yes or no.

  • Yes.

Is your project completed?

Please answer yes or no.

  • Yes.

Grantee reflection


We’d love to hear any thoughts you have on what this project has meant to you, or how the experience of being an IEGrantee has gone overall. Is there something that surprised you, or that you particularly enjoyed, or that you’ll do differently going forward as a result of the IEG experience? Please share it here!

  • Speaking as the project manager of The Co-op, this has been both a rewarding and challenging experience. Prior to this project, I had never led a project of this scope before, let alone worked so closely with graphic designers, programmers, researchers, and coordinators. IEGs are intended to last six months, but it is fair to say that we knew from the start that piloting a mentorship space would take much longer than that to be realized. We had a lot of early success doing our research, surveying and interviewing editors on the help landscape on en.wiki, and making thematic and practical plans for the layout of The Co-op.
However, the day-to-day work became more challenging in terms of physically building and maintaining the space once the pilot began. I want to make very clear that without the technical help from many key individuals from the WMF, namely Jonathan Morgan, Frances Hocutt, and Aaron Halfaker, this project would not have gotten off the ground, and I have immense appreciation for their generosity and skill. Checking the Co-op constantly throughout our pilot period was particularly draining; we had planned to have automated tools to assist with maintenance, but much of that work ended up falling to me. Some of those tasks included fixing broken profiles, manually finding matches, reminding mentors, and posting updates about problems and changes to The Co-op. I am convinced that such a responsibility needs to be better split in the future; it was too much for me to take on. One of the most frustrating aspects for me and many mentors was that we often found ourselves matched with editors who dropped off or simply were not there to edit productively. Mentorship can be positive, but it can feel like time wasted when an editor fails to respond or ends up getting blocked. I appreciate being able to see mentorship from this perspective, however, and I am grateful to have had this opportunity to examine what mentorship for newer editors actually looks like. I JethroBT (talk) 09:39, 24 May 2015 (UTC)


  1. Musicant, D. R., Ren, Y., Johnson, J. A., & Riedl, J. (2011, October). Mentoring in Wikipedia: a clash of cultures. In Proceedings of the 7th International Symposium on Wikis and Open Collaboration (pp. 173-182). ACM. http://www.tc.umn.edu/~chingren/pdf/papers/wikisym-2011-Mentoring.pdf