Grants:APG/FDC portal/Note on staff proposal assessments 2013-14 Round 1
Methodology behind the 2013-2014 Round 1 FDC staff assessments
The FDC staff assessments provide a snapshot of the staff review of Annual Plan Grant (FDC) proposals, and identify both strengths and concerns with the work proposed.
- We did this through both portfolio analysis - looking across proposals and calibrating across different organizations - and an assessment of the context of each individual organization and its proposal. We did our best to be as nuanced as possible. For example, we understand that an organization that is beginning to formalize structures and bring on staff may have far fewer resources for learning and evaluation processes than an organization with a longer history and full-time staff. At the same time, a larger organization is also likely to have complex program structures and processes, which are reflected in its use of metrics and measures.
- We significantly increased the inputs to the staff assessment process compared with last year. They include:
- The proposals themselves, with the attached documents (strategic plans, annual plans, detailed budgets, etc.)
- Past and current reports (FDC quarterly progress reports, grant reports, etc.)
- A portfolio view of financials across proposals: FDC_portal/Proposals/2013-2014_round1/Financial_overview.
- Internal financial, evaluation and compliance inputs from the WMF Finance, Programs, Legal and Grantmaking teams.
- We used a few key principles for the assessment, across all proposals. These are not new, and are detailed in the narrative and scoring sections of the assessment. They include:
- Impact on Wikimedia projects and the global movement. We looked for the rationale behind the work organizations do, aligning them to the Wikimedia strategic priorities and goals. We looked particularly for direct and indirect impact of this work on Wikimedia projects; for example, growth in contributors and content donation. Most importantly, we looked at whether the funds requested were justified in the context of current and potential impact of the proposed work.
- Organizational strategy, leadership and governance. We looked for evidence that an organization had a good understanding of its own context, and the needs, opportunities and challenges of the communities it seeks to serve or partner with. We looked for board, staff and volunteer leadership that is committed, effective and engaged with these communities and practising movement values of openness and transparency.
- Rates of growth and financial management. In this round, only two returning applicants asked for funds within the recommended maximum growth rate in movement resources (annual plan grants/FDC allocations) of 20%, which is itself a significant rate in many contexts. In addition, many organizations are underspending on previous grant allocations and/or have extensive reserves, which makes overall growth rates even higher. We appreciated organizations that were not spending for spending's sake, and were being careful and prudent with resources; we hope this continues. However, significant underspends (particularly on programs) across a number of organizations flag the challenges of budgeting, planning and program design. It may be that the most effective and impactful programs do not always need significant resources. We specifically looked at the nature of long-standing underspends in the context of organizations with a history of overbudgeting and underspending.
- Program design, learning and evaluation. We recognize that the Wikimedia movement at large is still developing a more nuanced understanding of how we measure our impact, as volunteers and as organizations. We looked at each organization’s self-defined metrics and measures for success, both qualitative and quantitative. We searched for indicators that went beyond process (or outputs) and looked at intended outcomes and specific goals or targets. We recognized, however, that many organizations are in the process of establishing baselines for their work, and incorporating learning into program design and re-design. We appreciated organizations that were open about these challenges and were honest and reflective about their own capacities and systems.
Note: We shared our near-final draft assessments with all FDC applicants a day ahead of publishing them, so that organizations had some lead time for sharing them with their Board and staff. We then took all responses into consideration where possible, making factual changes and clarifying our statements where necessary. We encouraged organizations to note all substantive comments and concerns on the Discussion pages of these assessments.