Grants:Programs/Wikimedia Community Fund/Committee review process and framework

How are General Support Fund proposals reviewed for Wikimedia Community Fund?

The Wikimedia Foundation Funds exist to support the strategic direction of the movement. We want to build a relationship of partnership with all of the people and organizations supported through these programs. For this reason, it is important to explain the review process within the new Funds strategy. Partners in the process: applicants, Community Resources regional program officers, and Regional Funds Committees.

The General Support Fund is reviewed by the new Regional Funds Committees, established in July 2021. These committees are decentralised, participatory decision-making bodies, composed of experienced and newer community members and organised in seven regions. More information about the Regional Committee process and its current members is available on the Regional Committees page on Meta and in a Diff article.

Our goal is to work with each applicant to explain decisions clearly. If the recommendation is not to fund the proposal as submitted, we are committed to working with the applicant to ensure they understand what would be needed in the future, and to supporting them in that journey.

A brief overview of the review process:

  • Applicant submission: Applicants submit their proposals through the new grantee portal; these are available on Meta for general community review and comment.
  • Due diligence: The Community Resources team organises the proposals and supporting documents for committee review. These supporting documents include any additional documents the applicants included in their proposal, such as strategic or annual plans, as well as background information about the organisation prepared by Program Officers based on several sources of information: such as Affiliate reports and surveys, review of past qualitative and quantitative results (from grant reporting), budget and staff growth, etc.
  • Regional Committee review: Committee members review these proposals and supporting documents individually, based on a series of criteria and questions. The categories and questions can be reviewed below.
  • Initial feedback: During this phase, committees may publish questions on Meta so that applicants can clarify any aspects of their proposal. Community members in general may also post comments, questions, or feedback that applicants should respond to via Meta.

  • Structured feedback from Regional Committee: The whole committee meets to discuss each proposal and consolidate a unified feedback document that is sent to applicants. This feedback may include suggestions for adjustments in certain areas of the proposal, or questions about issues that were unclear and that the committee would like to understand better in order to review the proposal.

  • Staff feedback shared with committee: At this stage, Program Officers also share their staff review of each proposal with committee members. This analysis seeks to offer additional information or insights about the proposal, based on Program Officers' engagement with applicants throughout the process as well as other Foundation staff knowledge of thematic areas and learning from other programs and experiences. It may include analysis of organizational growth and impact over time, the clarity of the applicant's approach and strategies, their staff and budget distribution, etc. Committees should feel ownership of the decision and be empowered to question the analysis from Foundation staff and to use it as another perspective.
  • Committee/Applicant meetings: In certain cases, committee members may request a live session with the applicants to discuss this feedback and ask questions. Program Officers may also organise these spaces to provide applicants with support to review this feedback.
  • Applicant responses and revisions: Applicants have a set time to make necessary adjustments or clarifications. These should be made directly in Fluxx as the final proposal.
  • Deliberations: Committee members hold a second round of formal deliberation sessions to make their final funding decisions. At this stage they take into consideration all the recommendations and adjustments, as well as the overall budget for the region. The final committee recommendation may include further recommendations to applicants to support their implementation work or future proposal development.
  • Committee funding decision and remarks: Applicants are informed of the funding decision via email, Fluxx, and Meta and, if approved, begin the grant administration process.

The primary questions the Regional Fund Committees are considering when reviewing the applications are:

  • What is the planned impact of the proposal and how does that support the strategic direction of the movement?
  • Does the proposal clearly articulate how the applicant plans to achieve the proposed impact? Is the plan feasible and financially sound?

Criteria 1: General clarity

1. How clear and coherent is the proposal in terms of the change it wants to make and what it thinks will help bring about that change?

2. How viable is the proposal in terms of the strategies and activities it proposes in the timeframe established? Have they identified major potential risks and relevant mitigation strategies? Does their timeframe have flexibility to allow for these mitigation strategies if needed?

Criteria 2: Impact Potential & Strategic Alignment

3. Has the proposal clearly stated how it hopes to address Knowledge Equity? Has it clearly identified how it will develop Knowledge Equity in its general strategies, activities, and budget?

4. If the applicant has an annual or a strategic plan, does the proposal relate to the organization’s strategic goals and vision?

5. Does this proposal have support from Wikimedia community members? Does it address needs that have been identified and/or prioritized by the community? Has there been sufficient engagement and/or inclusion of community members at appropriate points in the design, implementation, and evaluation plan? *Consider this evaluation according to the scope and experience of the applicant.

6. Does this proposal hope to bring in diverse newcomers (new participants) and have clear strategies to foster their retention and growth within the movement? Has consideration been given to creating an inclusive environment that supports safety and belonging?

7. Does the proposal contribute to the Movement Strategy 2030 recommendations?

8. Does the proposal include strategic partnerships: for example, partnerships that address the proposal's goals and have the potential to contribute in the longer term? (Not relevant for all applications.)

Criteria 3: Organizational capacity and budget

9. Does the applicant have the experience and organizational capacity to implement this project? If it is a new grantee or project, has the application accounted for risks?

10. Does the applicant have the team/roles needed to support the successful implementation of the proposal? Does the team have the necessary skills or do they have a plan to outsource these? Does the applicant show any relevant past experience that evidences this team/organizational capacity?

11. Does this proposal take an innovative approach that would contribute to our learning in the Wikimedia movement? If so, what do you think is most important about the learning potential of this application?

12. Does the proposed budget adequately reflect the investment needed to achieve the proposed goals? Is it sufficiently clear how the resources are going to be invested? Does the budget include investments in key areas such as learning and evaluation?

Does the budget consider equity issues and the investments needed to reach goals regarding greater equity? For instance: equity in staff/volunteer compensation and welfare, and investments needed to engage underrepresented participants and support the work of movement organisers.

Criteria 4: Learning and evaluation

13. Does the applicant show clarity in what they hope to learn from their work, given the change they are hoping to achieve? Does the applicant include qualitative metrics (data/information) that will help them gather the necessary information to learn from their work, and is there clarity on how this will be collected?

14. Does the applicant define quantitative metrics that are relevant to their work (i.e., related to what they want to learn, and ambitious yet realistic given the funding proposed)? Do they propose adequate tools to measure these?