What did we learn about Wiki Loves Monuments in terms of impact? We examined the results against the priority goals program leaders shared in their reporting and outlined critical next steps for program learning. Read this page for the most important takeaways and recommended next steps.
How does the program deliver against stated goals?
The four most commonly cited priority goals for 2013 and 2014 Wiki Loves Monuments events were: to increase contributions; to build and engage community; to increase awareness of Wikimedia projects; and to increase diversity of participants.
Data available on the total number of files uploaded through the Wiki Loves Monuments contests (rather than just those captured in this report) are shown in the chart below.
(Chart: Total number of images uploaded to Wikimedia Commons through Wiki Loves Monuments.)
Media uploaded through the Wiki Loves Monuments events captured in this report represents 11.4% of all media uploaded to Commons by registered users during the reporting period, and has been used in articles at nearly five times the rate of Commons uploads overall during the same period.
In terms of building and engaging community, participation in the examined events included more than 17,000 Wikimedia users (nearly 1,400 existing active users and 11,000 new users). Of the new users generated through the examined contests, 1.5% survived as editors on Wikimedia projects at three-month follow-up.
The majority of contest participants are newly registered Wikimedia users. In addition to the reach of the events themselves, nearly 90 percent of program leaders reported that they had developed blogs and other informative online documentation of their events. Promotional reach and participants' learning about Wikimedia projects are not captured by the data in this report. However, a community-led survey of 2013 contest participants included some items about how participants learned about the contest; the results indicate that, most often, participants learned of the events through banner postings (60%), while other routes were reported in much lower proportions.
The contests reported here were held in 51 different countries and engaged new users over existing users at a rate of two to one. Only five program leaders reported estimates of gender distribution; those estimates ranged from 2% to 39% female participation.
If increasing diversity continues to be a central goal of Wiki Loves Monuments events, it will be important to better track participant demographics and/or develop additional measures of project or content diversity.
Note: Here we examine all contests that reported together, but recognize that not all photo events may share these priorities. Wiki Loves Monuments events are a diverse category of programs and reflect a diversity of goals across contexts; we encourage organizers of each event to consider the data in terms of what matters most to their priority goals.
How this information can apply to program planning
Use this information to help you plan for program inputs and outputs. The numbers reported by other events can help you gauge what combination of participants might be right for the content outcomes you hope to achieve.
The data presented in this report suggest that small-scale events can be as effective as events with larger participation at bringing in new editors, while events with larger participation counts tended to have more media uploaded overall. Having more images, however, does not always mean more image use. When planning a photo event, it may be useful to balance group size with a mix of new and experienced users to increase image use and ensure high-quality uploads.
This information may be especially useful in budgeting and setting reasonable targets
Planning event budgets based on the budgets presented here would have many pitfalls: differences in event length and number of participants, local costs, and even event style affect budgets (see also the report on Other Photo Events). For example, a small photo event may consist of lending a Wikimedian photographer a camera so they can go to a cultural festival and upload the photos they take to Commons. If the event organizer already owns the camera, the event may incur no cost, while purchasing a camera would be a significant expense for another leader who does not already have that resource.
Reach out and connect to other leaders
To avoid surprises when using budgets presented here for planning purposes, try to find an event in a location with a similar economy to your area and consider reaching out to a successful program leader to discuss potential resource needs (including possible budget or donated resources). Alternatively, you can find an event based on the same model in a different location and talk to the program leader about the costs incurred before translating those expenses into local prices.
Use the distribution statistics as guardrails against costly plans that may not produce scaled results
The boxplots illustrating cost per participant and cost per media uploaded can also be helpful references if, as with the overall budget information, they are taken in the context of each event. If planning a new program, you might expect the costs to fall within the middle 50% of the costs per output reported (i.e., within the green bar on the boxplot). As programs move down the boxplot, they create better outcomes with fewer inputs. We hope, as we continue to evaluate programs and feed the results back into program design, that we can learn more from the programs achieving the most impact using the fewest resources.
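For organizers who want to run this check themselves, here is a minimal Python sketch of testing whether a planned event falls within the middle 50% of reported costs per output. All of the figures below (the cost-per-upload list, the planned budget, and the upload target) are hypothetical placeholders, not numbers from this report.

```python
import statistics

# Hypothetical cost-per-media-uploaded values (USD) collected from prior events.
reported_cost_per_upload = [0.10, 0.25, 0.40, 0.55, 0.75, 1.20, 2.00, 3.50]

# Quartiles: the middle 50% lies between the first and third quartile
# (the "green bar" region of the boxplot).
q1, _, q3 = statistics.quantiles(reported_cost_per_upload, n=4)

planned_budget = 1500        # planned spend for the new event (USD), illustrative
expected_uploads = 2500      # target number of media files uploaded, illustrative
planned_cost_per_upload = planned_budget / expected_uploads

if planned_cost_per_upload > q3:
    print(f"{planned_cost_per_upload:.2f} USD/upload is above the middle 50% "
          f"({q1:.2f}-{q3:.2f}); consider scaling back costs or raising targets.")
elif planned_cost_per_upload < q1:
    print(f"{planned_cost_per_upload:.2f} USD/upload is below the middle 50%; "
          f"an ambitious but possible target.")
else:
    print(f"{planned_cost_per_upload:.2f} USD/upload falls within the middle 50% "
          f"of reported costs.")
```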
Over the years, the global event team has been very effective in continuously improving and promoting the concept, which has led to a growing number of countries joining the competition. The global event team also offers local organizers support with legal advice and with questions around promotion, prizes, and potential partners. Furthermore, the global event team has become increasingly sophisticated in documenting the event and providing the community with tools to measure the competition's results. All of this makes Wiki Loves Monuments the program within our programmatic landscape with the most cohesive and clear set of goals, measures of success, and documentation for replication of success.
Still, we are currently examining a selection of events and reaching out to learn more from program leaders in order to develop a program toolkit specifically for gathering the different stories, resources, and advice for how to plan, run, and evaluate photo contests and events. Look for it later in 2015.
How does the cost of the program compare to its outcomes?
Because a comparatively large number of organizations receive grant funding explicitly for their Wiki Loves Monuments events, we were able to establish a basic cost-benefit baseline for Wiki Loves Monuments somewhat better than for any other program covered in this evaluation series. Keep in mind that, because we mined data from grant reports, this section addresses only Wiki Loves Monuments implementations that were funded by grants, not unfunded implementations.
Our cost-benefit analysis comprises data from the Wiki Loves Monuments implementations with reported budgets. Looking at the cost per uploaded image from the combined reporting, we find that, for those events with budgets reported, the average number of uploads is going down while the average cost per upload is going up:
(Table: number of countries, average budget (median), average number of uploaded images (median), and average cost per uploaded image (median), by contest year.)
Because photos that are used in Wikipedia articles reach a much larger audience than those that are not included in articles, we also looked into the number of images that are currently being used in Wikimedia articles (as of November 4, 2013 for 2012 contest uploads, and as of March 30, 2015 for 2013 and 2014 uploads) to see whether this demonstrated increased use. As outlined in the table below, image use was actually lower for events that reported budgets than for those that did not, for both the 2013 and 2014 data.
Knowing the cost of an image that was uploaded through Wiki Loves Monuments and is also in use on Wikipedia and elsewhere is important, as it allows us to compare that number to its equivalent from other photo upload initiatives. Here we see a slight trend toward increased cost alongside decreased impact, indicating that it is important to be cautious about the investment level for these photo events.
Note: Image usage figures are as of November 4, 2013, and will increase over time.
Note: Calculated for the 21,223 images used from the 11 WLM 2012 contests for which a budget was reported.
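To make the two ratios discussed above concrete, here is a minimal sketch of how cost per uploaded image and cost per image in use are computed. The budget, upload, and usage figures are illustrative assumptions, not values from the tables in this report.

```python
# Hypothetical event figures (not from the report).
budget = 4000           # total event budget (USD)
images_uploaded = 8000  # media files uploaded to Commons during the contest
images_in_use = 1200    # of those, files currently used in articles

cost_per_upload = budget / images_uploaded     # cost per uploaded image
cost_per_used_image = budget / images_in_use   # cost per image actually in use

print(f"Cost per uploaded image: {cost_per_upload:.2f} USD")
print(f"Cost per image in use:   {cost_per_used_image:.2f} USD")
```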
Join the conversation! Visit the report talk page to share and discuss:
Questions about Evaluation and Impact
What, if any, ideas do you have about other ways we should evaluate programs?
What questions around program impact or evaluation do you have after reading the reports?
What further data investigations would you like to see (or do!) for these programs?
Questions about Measures
What, if any, measures have you used that are missing from these reports?
What, if any, tools/bots/programs/strategies do you use to measure the outcomes of your programs?
Questions about Sharing
If you ran a program that delivered excellently against its goals, speak up! Consider writing a blog post or how-to guide highlighting your ideas on why your program was so successful.
If your program surmounted a particularly tricky problem in program design, consider writing a learning pattern!
If you have run a program and want to report key metrics to the Learning and Evaluation team, our collector is always open. Visit our reporting page to learn about the reporting form's contents and find the link to voluntary reporting.
Questions about Connecting
If you are considering running a new program or updating an existing one, consider reaching out to experienced program leaders who have organized a similar program.
Join the program leader mailing list for weekly updates about program evaluation, tools, and more.