Grants:IdeaLab/Application scoring system
What is the problem you're trying to solve?
The Individual Engagement Grants (IEG) review process currently uses an ad hoc scoring system built on Google Forms to collect input from reviewers. It is unwieldy and will not scale well as we have more proposals to review.
Findings from initial investigations into alternative solutions:
- The movement has demonstrated a need for scoring tools: many other programs across the Wikimedia movement also use Google Forms to score applications for things like scholarships.
- Outside our movement, other grant programs that run group review of large numbers of applications often do so in person, with applications physically scored and marked.
- Wikimania scholarships rely on a scoring system that is hosted by WMF. This system appears to have functionality that could meet the needs of IEG and other programs where scoring is needed, but the current application is not flexible enough to be extended beyond Wikimania scholarships. A new application could be built on this existing code base, if necessary.
What is your solution?
Build a flexible scoring system to meet our needs.
- Develop a scoring solution to meet the needs of IEG review by end of September 2014.
- Make this solution available for other programs where scoring is needed.
Analysis of the problem would start from a few core user stories:
- As an administrator of the grant review system
- I want to create new grant campaigns (defining reviewers, review criteria, data to be reviewed, and data grouping)
- So that reviewers can provide feedback on the applications in the campaign.
- As a grant administrator or reviewer
- I want to mark grant applications as eligible or ineligible
- So that ineligible grant applications can be excluded from further review.
- As a grant reviewer
- I want to score a grant application on multiple dimensions (defined by the admin)
- So that the quality of the grant application can be compared to others in the same campaign.
- As a grant provider or administrator
- I want to view reports on the aggregate reviewer scores for grant applications in a campaign
- So I can share results with others outside of the review system and select applications to fund.
The core user stories would be fleshed out with additional details, based on interviews with the IEG team and other interested parties within the Wikimedia movement, to produce a list of core requirements. (For example, an IEG committee member has already started drafting a spec, which could be discussed and incorporated.)
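As a rough illustration of the entities the user stories above imply, the sketch below models campaigns, criteria, eligibility, per-reviewer scoring, and an aggregate report. All names here (Campaign, Application, submit_score, report) are hypothetical for illustration, not taken from any existing codebase:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Application:
    applicant: str
    eligible: bool = True                        # admins/reviewers may mark ineligible
    scores: dict = field(default_factory=dict)   # reviewer -> {criterion: score}

@dataclass
class Campaign:
    name: str
    criteria: list                               # scoring dimensions defined by the admin
    reviewers: list
    applications: list = field(default_factory=list)

    def submit_score(self, reviewer, application, scores):
        """Record one reviewer's scores on every criterion for an application."""
        if reviewer not in self.reviewers:
            raise ValueError(f"{reviewer} is not a reviewer for {self.name}")
        if not application.eligible:
            raise ValueError("ineligible applications are excluded from review")
        missing = set(self.criteria) - set(scores)
        if missing:
            raise ValueError(f"missing criteria: {missing}")
        application.scores[reviewer] = dict(scores)

    def report(self):
        """Mean score per eligible application, highest first, for fund selection."""
        rows = []
        for app in self.applications:
            if not app.eligible or not app.scores:
                continue
            per_reviewer = [mean(s.values()) for s in app.scores.values()]
            rows.append((app.applicant, round(mean(per_reviewer), 2)))
        return sorted(rows, key=lambda r: r[1], reverse=True)
```

A real solution would also need persistence, access control, and campaign configuration UIs; the sketch only shows how the reviewer-facing scoring and the administrator-facing report relate to the same underlying data.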
Decide on a solution
The second phase would analyze existing FOSS applications that could satisfy these core requirements. This analysis would inform a "build versus buy" decision: whether to create a solution based on existing software or to start a brand-new project to satisfy the requirements.
The third phase will implement the build-versus-buy decision. The activities needed here will vary with the avenue chosen, but will likely require some software development in either case. If an existing application is found to be a close match, the third phase would involve configuring it to satisfy the requirements and to be deployable to the Wikimedia Foundation production cluster; adopting an existing product would quite likely also involve fixing small- to medium-complexity upstream bugs and adding small to medium features. If instead the decision is to build a new solution, the third phase would involve creating a new project that reuses as much code as is reasonable from the Wikimania Scholarships application.
- A GSoC or OPW participant is currently being sought to take on this project. If you are interested as a GSoC student or OPW intern, please follow the processes for getting involved defined by those programs.
- If you'd like to help advise on this project once it gets started, please add yourself to the list of participants!