Across the nearly 540,000 media files uploaded to Commons during the Wiki Loves Monuments events examined, over 5,000 image ratings were awarded to Wiki Loves Monuments photos as Quality, Valued, and/or Featured Images.
By late April 2015, about 18 months after most 2013 contests ended and six months after most of the 2014 events had, 47 of the events (65%) had «Quality Images», 25 (35%) had «Valued Images», and 35 (49%) had «Featured Pictures» associated with them.
Of the 539,875 uploads examined, more than 5,000 ratings were assigned.
The number of rated photos per contest ranged widely and included:
Although Quality Image, Valued Image, and Featured Picture ratings were associated with many of the media uploaded for the contests, the number of rated photos varied widely per contest, which suggests that contests may have varying practices for rating and/or varying success in achieving these ratings.
↑ Counts are inclusive such that an image rated both Quality and Featured is counted once in each category.
↑ Some photos may have been rated in multiple categories.
↑ Number of rated photos per contest ranged from 0 to 1,419 (median = 5; mean = 59; SD = 184).
↑ Number of rated photos per contest ranged from 0 to 546 (median = 0; mean = 11; SD = 65).
↑ Number of rated photos per contest ranged from 0 to 27 (median = 0; mean = 3; SD = 5).
For each contest, we assessed user retention rates for up to three possible follow-up periods: during the third, sixth, and twelfth month following the event start date.
A few things to remember about this section:
An active user is defined as one who made five or more edits during the follow-up period. We examined editing activity both in terms of the number of edits made to any Wikimedia project and in terms of the number of edits made only on Commons.
A survived user is defined as one who made at least one edit during the follow-up period. Again, we looked at editing activity both across all projects and specifically on Commons.
As outlined in the table below, for events which had reached or passed the 12-month follow-up point, we assessed retention at 3, 6, and 12 months following the event start date. For more recent events, retention was assessed out to the furthest follow-up point reached: three and/or six months.
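The definitions above can be made concrete with a short sketch. This is a hypothetical helper, not the report's actual analysis code; the 30-day month approximation and the function and variable names are assumptions for illustration. A user «survived» a follow-up window if they made at least one edit in it, and was «active» if they made five or more, where each window is the one-month period ending 3, 6, or 12 months after the event start date.

```python
from datetime import date, timedelta

def classify_retention(edit_dates, event_start, months_end):
    """Classify a user for the follow-up window ending `months_end`
    months after `event_start` (the window covers the preceding month).
    Hypothetical sketch: a month is approximated as 30 days."""
    window_start = event_start + timedelta(days=30 * (months_end - 1))
    window_end = event_start + timedelta(days=30 * months_end)
    # Count this user's edits that fall inside the follow-up window.
    edits_in_window = sum(window_start <= d <= window_end for d in edit_dates)
    return {
        "survived": edits_in_window >= 1,  # at least one edit
        "active": edits_in_window >= 5,    # five or more edits
    }

# Example: a user with two edits during month 3 survived but was not active.
edits = [date(2013, 11, 5), date(2013, 11, 20)]
result = classify_retention(edits, event_start=date(2013, 9, 1), months_end=3)
```

In a full analysis, the same classification would be run once against edits to any Wikimedia project and once against edits to Commons only.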
As illustrated by the green (new users) and blue (existing users) shaded columns in the graph below:
at least 11,429 new and 6,122 existing users had met or passed the 3-month follow-up period;
6,232 new and 3,289 existing users had met or passed the 6-month follow-up period; and
5,430 new and 2,885 existing participants had met or passed the 12-month follow-up period.
The dots illustrated in each of the green and blue columns in the chart below represent the number of editors who were active (black dots) or survived (white dots) to each follow-up point.
For new users, survival rates ranged from 0.8% to 1.5% and showed proportionately lower retention at follow-up points further out from the contest (i.e., 1.5% in Month 3; 1.0% in Month 6; 0.8% in Month 12).
For existing users, survival rates ranged from 34% to 36% and showed an increase in the proportion of existing users retained at the 6-month point, followed by a decrease at the 12-month point (i.e., 33.5% in Month 3; 35.8% in Month 6; 35.4% in Month 12).
6,122 existing users participated in the Wiki Loves Monuments contests examined in this report. Not surprisingly, existing editor retention is higher than new user retention for Wiki Loves Monuments. At the three month follow-up stage, 34% of existing users «survived» and 28% were «active» editors.
Three months after their event started, 1,743 existing users (28.5%) were retained as active editors and an additional 309 (5.04%) as surviving editors.
Six months after their event started, 980 existing users (29.8%) were retained as active editors and an additional 198 (6.0%) as surviving editors.
Twelve months after their event started, 877 existing users (30.4%) were retained as active editors and an additional 145 (5.0%) as surviving editors.
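The percentages above follow from dividing retained-editor counts by the existing-user cohort sizes reported earlier (6,122; 3,289; and 2,885 users who had met or passed the 3-, 6-, and 12-month follow-up points, respectively). A quick arithmetic check, using only figures from this report:

```python
# Existing-user cohort sizes and active-editor counts from this report.
cohorts = {3: 6122, 6: 3289, 12: 2885}
active = {3: 1743, 6: 980, 12: 877}

for month, n in cohorts.items():
    rate = active[month] / n * 100
    print(f"Month {month}: {rate:.1f}% active")
# Month 3: 28.5% active
# Month 6: 29.8% active
# Month 12: 30.4% active
```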
↑ The 3-month follow-up window refers to the period beginning two months after the event start date and ending three months after the event start date.
↑ The 6-month follow-up window refers to the period beginning five months after the event start date and ending six months after the event start date.
↑ The 12-month follow-up window refers to the period beginning eleven months after the event start date and ending twelve months after the event start date.
↑ New users were defined as usernames whose first use of Commons occurred no more than two weeks before the contest start date.
↑ 74% of the new active editors were editing on Commons, the targeted project, at three-month follow-up.
↑ 65% of the new active editors were editing on Commons, the targeted project, at six-month follow-up.
↑ 75% of the new active editors were editing on Commons, the targeted project, at twelve-month follow-up.
↑ 72% of the existing active editors were editing on Commons, the targeted project, at three-month follow-up.
↑ 73% of the existing active editors were editing on Commons, the targeted project, at six-month follow-up.
↑ 72% of the existing active editors were editing on Commons, the targeted project, at twelve-month follow-up.
Wiki Loves Monuments program leaders are proactive in producing materials, mainly blogs and other online resources, related to their events that can help others implement their own contest.
We asked program leaders to share their practices around program replication and the types of shared learning resources produced for their events. This lets us learn whether the reporting program leaders considered themselves experienced in implementing contests and able to help others design and implement Wiki Loves Monuments events of their own. It also lets us learn how program leaders and others (e.g., chapters, press, bloggers) covered the events, and whether resources were made available for others to use in producing their own events.
For the 30 events with reports by program leaders on replication strengths, we learned that 27 (90%) felt they were experienced at producing photo contest events and could help others conduct their own. 26 (87%) reported having blogs or other online information available for others to learn more about the event. A smaller number of program leaders reported that they developed brochures and/or printed materials (13; 43%) and guides or instructions on how to contribute to Wikipedia for event participants (9; 30%) (see graph below, Replication and shared learning).
Further investigation is needed into how these resources are shared among program leaders so that they can learn from one another's experiences and practices.