On this page, you will find answers to:
What are some long-term outcomes of edit-a-thons? What are some relationships between budget and participants or hours and content? How many users are retained?
A total of 5,727 articles were created or improved as part of the 97 edit-a-thons that reported, and 674 (12%) of these were articles created. The average edit-a-thon resulted in 7 articles created or improved.
From the 97 edit-a-thons reporting this metric (78%), the number of articles created or improved per event ranged from 0 to 983, and the average event created or improved 7 articles.[2]
Articles created were the most common type of content added (674 articles in 87 edit-a-thons), followed by photos and media added (639 media in 17 edit-a-thons).
Articles created and improved were the most frequent type of content contribution and were reported for 97 edit-a-thons. One of these edit-a-thons even reported a featured article. In addition, 639 media files were added during the 17 edit-a-thons that reported this metric, and 48 of those media files were used on article pages. While most contribution metrics for edit-a-thons focus on the amount of text added, additional data on these other measures would paint a more complete picture of contributions at edit-a-thons.
From the 87 edit-a-thons reporting (72%), 3,362 pages of text were added and 437 pages were removed across more than 5,700 articles created or improved as part of the events. The average edit-a-thon resulted in 13 pages of new text.
Text pages are a way to frame how much content is added to articles. For this report, one page of text was defined as 1,500 characters in any language.[3] From the 87 edit-a-thons reporting bytes added, the average event resulted in about 12.5 total text pages.[4] Counting pages of text added and removed across these edit-a-thons, a total of 3,361.7 pages of text were added, while 436.9 pages of text were removed. The average number of pages added was 12.6.[5] The average number of pages removed was 0.6.[6]
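To make the conversion concrete, here is a minimal sketch of the bytes-to-text-pages calculation described above. The per-script bytes-per-character factors follow the footnote on text pages, and the example byte count is hypothetical.

```python
# Minimal sketch of the text-page conversion: bytes changed -> characters ->
# text pages. The bytes-per-character factors follow the report's footnote
# (Latin/Cyrillic ~1 byte per character, Arabic/Armenian ~2); the example
# byte count is hypothetical.

CHARS_PER_PAGE = 1500  # one "text page" = 1,500 characters in any language

BYTES_PER_CHAR = {
    "latin": 1.0,
    "cyrillic": 1.0,
    "arabic": 2.0,
    "armenian": 2.0,
}

def text_pages(bytes_changed: int, script: str = "latin") -> float:
    """Estimate the number of text pages represented by a byte delta."""
    characters = bytes_changed / BYTES_PER_CHAR.get(script, 1.0)
    return characters / CHARS_PER_PAGE

# Example: 18,750 bytes added to a Latin-script article is roughly
# 12.5 text pages, about the average reported per edit-a-thon.
print(round(text_pages(18_750), 1))  # 12.5
```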
The typical cost per article created or improved was $0.00 USD. The cost per article created or improved was less than $17.40 for twelve (75%) of the sixteen edit-a-thons reporting data, but so few budgets were reported that the context of the implementation is very important when interpreting the numbers, which should not be generalized.
Data was obtained on the cost per article created or improved for sixteen (13%) of the 121 edit-a-thons included. The cost per article created or improved has a wide range, from $0.00 to $160.00 USD, but a narrow distribution: for 75% of the reporting edit-a-thons, the cost per article created or improved is below $17.40 USD. For the typical edit-a-thon, the cost per article created or improved is $0.00 USD.[7]
The typical cost is $0.00 USD per text page and the cost per text page added or removed is less than $12.13 for the 14 edit-a-thons for which we have data.
We obtained data on the cost per text page added or removed for 14 (12%) of the 121 edit-a-thons. The cost per text page also has a narrow distribution: the cost per text page for every reporting event is less than $12.13 USD, and for 75% of the edit-a-thons it is less than $3.91 USD. For the typical edit-a-thon, the cost per text page is $0.00 USD.[8]
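As a rough illustration of how these per-unit costs are derived and summarized, here is a minimal sketch using hypothetical per-event budgets and article counts; the actual per-event data behind the figures above are not reproduced here.

```python
# Minimal sketch of deriving and summarizing cost per article created or
# improved. The per-event budgets and article counts are hypothetical.
from statistics import median, quantiles

events = [
    # (budget_usd, articles_created_or_improved)
    (0.0, 12), (0.0, 40), (0.0, 7), (250.0, 55),
    (0.0, 3), (120.0, 9), (500.0, 80), (0.0, 25),
]

cost_per_article = [budget / articles for budget, articles in events if articles > 0]

# The "typical" cost reported above is the median; the 75th percentile
# describes the narrow upper end of the distribution.
print("median cost per article:", round(median(cost_per_article), 2))
print("75th percentile:", round(quantiles(cost_per_article, n=4)[2], 2))
```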
Only 14 edit-a-thons reported information for budget, number of participants, and text pages added or removed (12%). Most edit-a-thons reporting a budget included 11 to 23 participants and there was no relationship between budget and text pages.
Only 14 edit-a-thons (12%) reported budget, number of participants, and text pages added or removed. Among these, budget per week shows no significant relationship with the number of participants or with text pages,[9] and the number of participants shows no significant relationship with text pages.[10]
No relationship was found between the number of organizer hours, participation, and text pages added or removed.
Looking at hours invested alongside numbers of participants or amount of content, only sixteen edit-a-thons reported both hours and numbers of participants, and only seven reported both hours and text pages added or removed. There were too few data points, with too much variation, to meaningfully interpret any relationship between hours invested and participation or content.
Participation, articles created or improved, and text pages
We found a positive relationship between numbers of participants and articles created or improved, but no relationship between numbers of participants and amount of text pages.
For the 95 events (79%) that reported both participants and articles created or improved, the number of articles created or improved tends to increase as the number of participants increases.[11] However, for the 86 events (71%) that reported both participants and text pages added or removed, we find no relationship between the number of participants and the number of text pages created.[12]
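The footnoted coefficients and p-values suggest a simple correlation analysis per pair of measures. The sketch below shows one way such a test might be run; the per-event values are hypothetical, and Pearson's r via SciPy is an assumption, since the report does not state the exact procedure used.

```python
# Minimal sketch of a correlation test between per-event measures.
# The arrays are hypothetical; Pearson's r is assumed, not confirmed by the report.
from scipy.stats import pearsonr

participants = [8, 12, 15, 20, 23, 30, 45, 60]            # participants per event
articles     = [3, 5, 9, 14, 10, 22, 30, 41]               # articles created or improved
text_pages   = [2.0, 15.5, 1.2, 8.0, 0.5, 12.0, 3.3, 6.7]  # text pages added or removed

r_articles, p_articles = pearsonr(participants, articles)
r_pages, p_pages = pearsonr(participants, text_pages)

print(f"participants vs articles:   r={r_articles:.2f}, p={p_articles:.3f}")
print(f"participants vs text pages: r={r_pages:.2f}, p={p_pages:.3f}")
```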
Six-month retention data was available for 80 edit-a-thon events. Of 354 new users involved in these events, 19 (5%) were retained as active editors 6 months after their edit-a-thon. Of the 853 existing editors involved, 433 (51%) were retained as active users 6 months after the edit-a-thon in which they participated.
A few things to remember with this section:
A follow-up period is a 30-day window some time after the event start date. In the case of edit-a-thon retention, one-month and six-month follow-up windows were assessed, meaning user activity was mined for the first and sixth months after the event start date, respectively.
An ''active'' editor is defined as a user who makes 5 or more edits during the follow-up period. We examined editing activity both in terms of the number of edits made to any Wikimedia project and in terms of the number of edits made only to the project that was the main focus of the event.
A ''survived'' editor is defined as a user who made at least one edit during the follow-up period. Again, we looked at editing activity both across all Wikimedia projects and specifically on the event's main project; a sketch of how these definitions translate into a classification follows.
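As a minimal sketch of how these definitions can be applied, the snippet below classifies a single editor for one follow-up window. The edit dates and window boundaries are hypothetical, and the five-month offset is approximated as 150 days.

```python
# Minimal sketch of the retention classification defined above:
# >= 5 edits in the follow-up window = "active", >= 1 edit = "survived".
# Edit dates and window boundaries are hypothetical.
from datetime import date, timedelta

def classify_retention(edit_dates, window_start, window_days=30):
    """Classify one editor for a single 30-day follow-up window."""
    window_end = window_start + timedelta(days=window_days)
    edits_in_window = sum(1 for d in edit_dates if window_start <= d < window_end)
    if edits_in_window >= 5:
        return "active"
    if edits_in_window >= 1:
        return "survived"
    return "not retained"

# Six-month follow-up: a 30-day window starting roughly five months
# (approximated here as 150 days) after the event start date.
event_start = date(2014, 3, 1)
six_month_window_start = event_start + timedelta(days=150)

edits = [date(2014, 8, 2), date(2014, 8, 10), date(2014, 8, 11),
         date(2014, 8, 20), date(2014, 8, 25)]
print(classify_retention(edits, six_month_window_start))  # "active"
```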
For the one-month window, data was available on 85 events. There were a total of 354 new users and 890 existing users involved in the eligible events. For the six-month window, data was available for 80 events. There were a total of 354 new users and 853 existing users involved in the eligible events.
The dots illustrated in each of the green and blue columns in the chart below represent the number of editors who were active (black dots) or survived (white dots) at each follow-up point.
At the one-month follow-up window, 566 existing users (64%) were retained as active editors and an additional 102 (11%) were retained as surviving editors.
At the six-month follow-up window, 433 existing users (51%) were retained as active editors and an additional 44 (5%) were retained as surviving editors.
All edit-a-thons that reported on shared learning (38 events) were run by experienced program leaders who are willing to help others with similar programs. Blogs and other online resources were used to spread information in 37% of cases. Every event created some sort of resource for sharing knowledge about edit-a-thons.
Information on replication and shared learning was available for 31% of the edit-a-thons included in this report (38 events). Of these, 100% were run by experienced program leaders. Furthermore, 37% (14) reported using blogs or other online information to tell others about the event; 24% (9) created brochures or other printed materials to tell others about the event; and 13% (5) generated guides or instructions on how to run a similar event. The graph below compares these different percentages across edit-a-thons.
↑ Although we list content production as an outcome, it can be an output or outcome, depending on the logic model of the contest organizer.
↑ Text page is a metric used in a previous report for the Wikipedia Education Program and is one we use here as an intuitive way to illustrate the amount of content produced during edit-a-thons. We are able to obtain the number of bytes added to or removed from an article and then convert that to characters added or removed. One byte does not equal one character in all languages. For example, while Latin and Cyrillic characters are about 1 byte per character, Arabic or Armenian characters are about 2 bytes per character. We count the number of characters to determine the number of text pages. In English, about 1,500 characters equals one Letter size page with double-spaced text, including any wiki markup.
↑ Correlation coefficient, budget to participants = -0.07, ''p-value'' = 0.76; correlation coefficient, budget to text pages added or removed = 0.30, ''p-value'' = 0.29.
↑ Correlation coefficient, participants to text pages = -0.05, ''p-value'' = 0.87.
↑ 1-month follow-up window refers to the period one calendar month past the start date, counting back 30 days. This period often includes the contest period.
↑ 6-month follow-up window refers to the period beginning five months following the event start date and ending six months after the event start date.