Grants talk:IEG/Learning/Round 1 2013/Impact
How is impact defined?
How is impact defined and how are impacts compared among projects? --Pine✉ 08:06, 7 February 2014 (UTC)
- Hi there Pine. Impact is defined in terms of the Wikimedia strategic goals (in this case Reach, Participation, and Quality). Performing impact assessments like this one after each IEG round will help us develop metrics for comparing levels of impact between projects that focus on the same goals. Ideally many of those metrics will be quantitative and anchored in relevant external research and top-line movement metrics. Jmorgan (WMF) (talk) 01:13, 8 February 2014 (UTC)
- Hi Jmorgan. The basis for my question is a statement I heard that The Wikipedia Library had the most impact of all IEG Round 1 projects. To make that statement there must be a single metric for impact that can be used to measure impact for all projects, right? --Pine✉ 03:34, 8 February 2014 (UTC)
- We think that The Wikipedia Library (TWL) is demonstrating high impact potential - i.e., bringing in lots of resources and distributing at scale, which can increase the quality of our content (a strategic priority). Not sure where you heard the direct comparison, beyond perhaps that this project, more than some others, has clear demonstrations of payback, usage, and scalability. Hope that helps! Jwild (talk) 17:45, 10 February 2014 (UTC)
- You raise a good point about TWL, though. TWL has provided access to resources that have known monetary value, and a lot of them. So while it would be easy to focus solely on that metric, such a focus would be misleading: monetary return on investment will not be an appropriate metric for many (most?) of our projects. What TWL does exceptionally well, in terms of potential for impact after the grant period, is lay out a clear roadmap for continuing the work, and show how the infrastructure built by the grant (in terms of relationships with orgs, as well as on-wiki project spaces and community involvement) will help sustain TWL. I tried to capture the importance of effective planning for scalability & sustainability, which is a feature of several of the Round 1 IEGs, in these learning patterns: Community impact and Firm foundation. Jmorgan (WMF) (talk) 21:01, 10 February 2014 (UTC)
- OK. I'm looking for ways to make quantitative projections about impacts of proposed projects using what has been learned from previous projects and I'm looking for ways to use what we learn from reports like this one to develop and evaluate future proposals in quantitative ways. Does that make sense? --Pine✉ 08:18, 11 February 2014 (UTC)
- That makes sense to me, at least. I would like to be able to compare impact in different areas across projects with the same general goal. I do not know of anything yet like an 'impact data sheet' for projects that summarizes their impact along a set of common dimensions. For instance, for projects aiming to support existing editors, it would be helpful to see estimates of the monthly use of tools or processes developed, amount of participation in on-wiki spaces that were created, number of editors affected, number of articles or media affected, and hours/month of volunteer effort needed to continue the project (in the future). Projects that aim to improve quality could similarly have estimates of their impact on quality. For a project like TWL, where the value seems obvious yet it's hard to distinguish edits that benefitted from it from those that did not, some quantitative estimates could be very crude, or simply 'not estimated'. And a metric such as "in-kind donation value" could be appropriate here, but not for many others, as Jmorgan notes. –SJ talk 20:35, 12 March 2014 (UTC)
Translation tool...
"Translation Extension Currently only works from English to other languages; desire for the ability to translate from any language to any language" — what is this about and where did it come from? --Gryllida 11:37, 7 February 2014 (UTC)
- I guess it's about the Translate extension. If I recall correctly, the source language can be changed in the site's configuration - the problem is that it cannot be set differently page by page. whym (talk) 15:19, 7 February 2014 (UTC)
- yes - that's what it is about, and it comes from the final report for the wikiArS project, under "Translation and internationalization issues." Jwild (talk) 17:48, 10 February 2014 (UTC)
- Is that the only grant that ran into this issue? There is a known solution to this bug; if people who need it push for it, it should not be impossible to resolve. See the bug discussing this: bugzilla:35489 –SJ talk 20:35, 12 March 2014 (UTC)
Overview video
Hey everyone - we had a Google Hangout today to discuss some of these findings. Please see the video if you are interested in the overview! Thanks. Jwild (talk) 18:06, 19 February 2014 (UTC)
- Thanks for this overview and posting the links. I find it very useful! I am not sure what I was expecting, but this is great in that it shows the measurements of what is measurable. The next step is to dream up more ways of measuring and see if this can be done with the data already collected. Jane023 (talk) 09:43, 27 February 2014 (UTC)
- Thank you! There are a lot of next steps, and I agree a lot of it has to be around thinking through what are the most critical things we want to measure, and how do we track those longer term. If you are interested in helping think through some of this stuff, stop by the evaluation space on meta or our facebook group (if you have an account there). See you around :) Jwild (talk) 14:47, 27 February 2014 (UTC)
People reached
I hear that IEG programs reached 20 thousand people. I find no such number in this report; where does it come from?
- There is a mention of 10k Twitter (Weibo) followers, but I suppose they're not included in the count in question? I don't see the tens of thousands of followers of other microblogging handles and Facebook pages included in the APG and PEG reports, nor do I see what sense it would make to count a passive connection as "participation" or "people reached".
- Did you include the participants in the Wikisource birthday/proofreading contest? The it.source component, which accounts for most of the participation, was separately funded by WMIT, so it's inappropriate to count it here (though it was on topic as a mention in the project report).
--Nemo 18:16, 2 August 2014 (UTC)
- It includes all people touched: so, yes, including the Weibo 10K and also the people touched through The Wikipedia Adventure. For this year, we counted anyone who was included in the reports we received back. For next year, we are working on a consistent way of gathering this information across grants programs. Jwild (talk) 15:26, 4 August 2014 (UTC)