Grants:Project/CS&S/Structured Data on Wikimedia Commons functionalities in OpenRefine/Final


Report accepted
This report for a Project Grant approved in FY 2021-22 has been reviewed and accepted by the Wikimedia Foundation.



Welcome to this project's final report! This report shares the outcomes, impact and learnings from the grantee's project.

Please note that this final report collects opinions and reflections by Sandra Fauconnier, who worked as Product Manager for this project. I (Sandra) warmly welcome reflections by other stakeholders too, and will be happy to engage in follow-up conversations.

Part 1: The Project


Summary

 
OpenRefine logo in Wikimedia colours, created by a Wikimedian as icon for the OpenRefine-Wikimedia Telegram channel

Thanks to this Project Grant, OpenRefine can now be used to batch edit and batch upload Wikimedia Commons files, with an emphasis on adding structured data (Structured Data on Commons / SDC).

  • Batch edits to Wikimedia Commons files can be performed with OpenRefine version 3.6 (and newer versions).
  • A dedicated Wikimedia Commons extension for OpenRefine provides additional features for Wikimedia Commons users:
    • The ability to start an OpenRefine project from one or more Wikimedia Commons categories
    • Dedicated scripting (GREL) commands to extract data from Wikitext
    • Thumbnail previews of Wikimedia Commons files
  • Batch uploads of files to Wikimedia Commons can be done with OpenRefine version 3.7 (and newer versions).

You can learn more about this at Commons:OpenRefine.

A cloud version of OpenRefine 3.6 is also directly available to Wikimedia users via PAWS.

 

Project Goals


Our two project goals were the following. The project received a four-month extension; we list the results that were achieved by the end of October 2022.

 
Presentation about upcoming Commons features in OpenRefine at WikidataCon 2021 (October 2021)

By mid-2022, there will be a cross-platform, free and open source tool – OpenRefine – which allows Wikimedians to edit and upload large and diverse batches of files with structured data on/to Wikimedia Commons.

We achieved this goal by delivering various pieces of software:

By mid-2022, Wikimedians who are new to OpenRefine have a fast learning curve to using this tool on Wikimedia Commons, because they have access to good training materials and documentation.

This goal was partly achieved, but needs more work. We delivered basic documentation. In the course of this project, we focused on delivering useful software features and testing these; we haven't yet had time to develop the more extensive training materials that will unlock broader adoption of OpenRefine in the Wikimedia and GLAMwiki communities. We did deliver:
OpenRefine is not a wizard-style tool (like Pattypan). It is a very powerful 'swiss army knife' for data, and will always have a certain learning curve. User testing during this project has shown that Wikimedia / Wikidata power users, and GLAM staff experienced in databases, most easily onboard with OpenRefine. The tool is generally more challenging for 'vanilla' Wikimedians and for staff of smaller GLAMs, especially if these people are not accustomed to working with (structured) data and databases.

Project Impact


Important: The Wikimedia Foundation is no longer collecting Global Metrics for Project Grants. We are currently updating our pages to remove legacy references, but please ignore any that you encounter until we finish.

Targets

  1. In the first column of the table below, please copy and paste the measures you selected to help you evaluate your project's success (see the Project Impact section of your proposal). Please use one row for each measure. If you set a numeric target for the measure, please include the number.
  2. In the second column, describe your project's actual results. If you set a numeric target for the measure, please report numerically in this column. Otherwise, write a brief sentence summarizing your output or outcome for this measure.
  3. In the third column, you have the option to provide further explanation as needed. You may also add additional explanation below this table.
Planned measure of success (include numeric target, if applicable) | Actual result | Explanation
(Goal 1, Output) We will build a Wikimedia Commons SDC reconciliation service. | Built and delivered: https://commonsreconcile.toolforge.org (finalized in February 2022). A minimal query sketch follows below this table.
(Goal 1, Output) We will develop OpenRefine functionalities that allow users to batch edit the structured data of diverse sets of files on Wikimedia Commons. | Built and delivered:
(Goal 1, Output) We will develop OpenRefine functionalities that allow users to batch upload files with structured data to Wikimedia Commons. | Built and delivered:
(Goal 1, Outcome) After August 2022, Wikimedians without coding skills can do large-scale editing and uploading projects with SDC thanks to the availability of OpenRefine as the first all-round, cross-platform and open source tool to support this functionality. | Wikimedians without coding skills can now indeed use OpenRefine for batch edits and uploads on Wikimedia Commons, with SDC.
(Goal 1, Outcome) For the longer term, the new functionalities provided by OpenRefine strengthen Wikimedia’s strategic goals around offering knowledge as a service, as mentioned in the 2030 strategic direction. GLAM institutions and other external organizations that want to share free knowledge with the world are encouraged to contribute new media files with rich metadata to Wikimedia Commons because the upload process is streamlined via a tool that many of them already use, and because the addition of multilingual, structured data to their files significantly increases their discoverability and impact. | OpenRefine, a widely used tool for data cleaning and manipulation in the cultural sector, has indeed been extended with batch features for Wikimedia Commons. At the time of writing this report, more than 6 GLAM institutions (counted via EditGroups [1]) have already performed batch edits and uploads on Wikimedia Commons (before the official release of OpenRefine 3.7). | This target is very ambitious and focuses on the longer term, and some aspects of it are beyond the control of the team which worked on this grant (specifically the discoverability and impact of structured data). Time will tell whether this project has indeed contributed to this goal!
(Goal 2, Output) We will develop online training materials for Wikimedians who want to teach themselves to work with OpenRefine for editing and uploading SDC-enhanced files on Wikimedia Commons.

To feed this (permanently available) online training material, we will also organize a set of online webinars. The video recordings of these webinars will also be edited and made available online.

First training materials (how-to guides) are available, and OpenRefine for Commons has already been demonstrated and taught in various online workshops. See the full list under Project resources in this report. More work in this area will be welcome though. After the official release of OpenRefine 3.7 it will be helpful to improve these first training materials, and to give new formal training courses, including a Train the Trainers program.
(Goal 2, Output) We will create documentation pages on using OpenRefine for editing and uploading SDC-enhanced files on Commons. | OpenRefine's own manual includes documentation on editing Wikimedia Commons. Documentation on file upload is forthcoming.
(Goal 2, Outcome) Wikimedians have access to clear documentation and training materials that allow them to get started with batch editing and uploading SDC-enhanced files on Wikimedia Commons. | See the two Goal 2, Output items above.
(Participation or content) At least 20 individual people and/or organizations have provided input and feedback during the development process of OpenRefine’s new Wikimedia Commons features, including representatives of at least 3 international GLAM institutions.
  • User testing sessions as part of UX research: 15 interviews, of which 10 from international GLAM institutions
  • Test uploads (outside of UX research) by at least 5 additional people
  • 5 Wikimedians and GLAM staff (not part of the OpenRefine team) submitted GitHub issues and feature requests related to Wikimedia Commons for OpenRefine
(Participation or content) At least 100,000 Wikimedia Commons files have received some new structured data in the first six months of 2022, thanks to OpenRefine’s new editing functionality released around end 2021. (This can be measured because the edits through OpenRefine will have a dedicated tag.) | At the time of writing this report (early December 2022), more than 150,000 edits have been done on Wikimedia Commons with OpenRefine, and more than 20,000 files have been uploaded there.
(Participation or content) At least 20 Wikimedians have edited structured data of media files on Wikimedia Commons, by using the newly developed editing functionality in OpenRefine in the first six months of 2022. | 19 individual Wikimedians (outside the OpenRefine team) edited structured data on Commons with OpenRefine before the end of June 2022 (counted manually in the EditGroups tool).
(Participation or content) At least 30 individual people and/or organizations have participated in OpenRefine’s new feature webinars, including representatives of at least 3 GLAM institutions. | We forgot to count participants of several presentations, but this target was certainly exceeded: two presentations alone had 40+ and ~20 participants respectively, including representatives of at least 10 cultural institutions.
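A side note on the reconciliation service listed in the first row above: it implements OpenRefine’s standard reconciliation API, so it can also be queried outside OpenRefine. The sketch below is only a hedged illustration (the exact API path on commonsreconcile.toolforge.org, the query form, and whether GET or POST is expected are assumptions), not part of this project’s deliverables.

  # Hedged sketch: query a reconciliation service that implements the standard
  # OpenRefine reconciliation API. The base URL comes from this report; the
  # exact API path may differ, and some services expect POST instead of GET.
  import json
  import requests

  SERVICE_URL = "https://commonsreconcile.toolforge.org"  # API path may differ

  def reconcile(name):
      """Ask the service for reconciliation candidates for one query string."""
      queries = {"q0": {"query": name}}
      response = requests.get(SERVICE_URL, params={"queries": json.dumps(queries)})
      response.raise_for_status()
      # Per the reconciliation API, each candidate carries at least
      # "id", "name", "score" and "match".
      return response.json()["q0"]["result"]

  for candidate in reconcile("Mona Lisa"):  # example query string
      print(candidate["id"], candidate["name"], candidate["score"])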


Story


Looking back over your whole project, what did you achieve? Tell us the story of your achievements, your results, your outcomes. Focus on inspiring moments, tough challenges, interesting anecdotes or anything that highlights the outcomes of your project. Imagine that you are sharing with a friend about the achievements that matter most to you in your project.

  • This should not be a list of what you did. You will be asked to provide that later in the Methods and Activities section.
  • Consider your original goals as you write your project's story, but don't let them limit you. Your project may have important outcomes you weren't expecting. Please focus on the impact that you believe matters most.
  • We deployed lots of brand new features in the course of a year! It is now possible to batch edit and batch upload Wikimedia Commons files with OpenRefine, with an emphasis on structured data.
  • We advertised the new features quite early, while they were still under development, and asked users to test them for us. Many users bravely downloaded and tested OpenRefine’s snapshot releases (unstable daily builds) and provided us with feedback and bug reports. This has already resulted in 150,000 edits on Wikimedia Commons at the time of writing this report, and over 20,000 uploaded files.
  • Internally, we experimented with engaging a UX designer, Lozana Rossenova, who provided the project with a critical external perspective and acted as a sounding board. She conducted extensive user research (surveys, test sessions, user interviews), produced detailed wireframe designs, and overall helped us to prioritize new features. Quite a few of the features that we deployed (thumbnails, schema templates) are a direct result of her input.
  • As a very useful by-product, this project has paved the way for better OpenRefine features for arbitrary Wikibases as well. This Wikimedia Project Grant has also helped strengthen OpenRefine’s relationship with the Wikibase Stakeholder Group. (Image: improved dialog window to add and select Wikibase instances in OpenRefine.)
    • With this Wikimedia grant, we improved the dialog window in OpenRefine that allows users to select and add any Wikibase (not only Wikidata and Wikimedia Commons).
    • With this Wikimedia grant, we also introduced a renewed format for editing schemas in OpenRefine (the templating area in which the data model for an import into Wikimedia Commons or any Wikibase is prepared). Schemas have become more flexible, can accommodate Wikitext, and can now be shared and imported/exported independently ("schema templates"). Both features (this one and the one in the bullet point above) benefit the Wikibase ecosystem at large.
    • In 2022, OpenRefine received a grant from NFDI, the German National Research Data Infrastructure to also develop media file import into arbitrary Wikibases (i.e. generalizing file upload functionalities to any Wikibase, building upon the already developed Wikimedia Commons upload features).

Survey(s)


If you used surveys to evaluate the success of your project, please provide a link(s) in this section, then briefly summarize your survey results in your own words. Include three interesting outputs or outcomes that the survey revealed.

 
Summary and analysis of user survey, April 2022

N/a (no survey was held for project evaluation, but we performed a survey to learn more about user needs).

Other


Is there another way you would prefer to communicate the actual results of your project, as you understand them? You can do that here!

N/a

Methods and activities


Please provide a list of the main methods and activities through which you completed your project.

  • For a detailed overview of all activities in this project, see our monthly updates (collected from June 2021 until October 2022).
  • Methods and activities until the end of 2021 have been listed in our grant's midpoint report. Below, we list some major highlights of what happened in 2022.

Design research, user testing

  • Outcomes of initial design research in 2021 have greatly influenced feature prioritization in this project. This grant's midpoint report describes the process and decision-making that took place in 2021; in 2022 we acted upon those insights. Thanks to the additional design and software development budgets awarded in our four-month (July-October) grant extension, we have been able to develop quite a few additional features that improve Wikimedia Commons users' experience in editing and uploading Commons files with OpenRefine.
  • In 2022, we held a user survey and two rounds of user testing, to further identify the highest-priority areas of improvement. We have acted upon quite a few of these findings. In particular, the user tests inspired us to further prioritize the feature of schema templates. These are data modeling templates inside OpenRefine that help Wikimedia Commons editors and uploaders to work with common scenarios, such as uploading/editing files that show artworks and books.

Software development

  • Our team decided to create a separate Wikimedia Commons extension for OpenRefine, to accommodate various features that are specific to Wikimedia Commons (image: an OpenRefine schema for Wikimedia Commons, where the structure, i.e. the data model, for an upload is being prepared):
    • Users can start OpenRefine projects from one or more Wikimedia Commons categories, with configurable category depth (see the API sketch after this list).
    • Users now see thumbnails of media files on Wikimedia Commons.
    • The Wikimedia Commons extension offers various Wikimedia Commons-specific GREL commands (GREL is a simple scripting language inside OpenRefine) that extract pieces of Wikitext from existing Wikimedia Commons files.
  • In OpenRefine's own code base, and in its Wikibase extension, various foundational architectural changes have been made, including:
    • Wikibase entities are now managed more flexibly inside OpenRefine's Wikibase extension. This also laid the (general) foundation for better support for arbitrary Wikibases, and unlocks potential future support of other entity types such as Lexemes.
    • There are now more refined editing options in OpenRefine's schema editor, including the possibility to (not) overwrite and even delete Wikibase / Wikidata / SDC statements.
    • The technical architecture for the 'grid view' in OpenRefine has been updated and extended, unlocking the possibility to display thumbnails.
  • We also contributed updates to Wikidata Toolkit.
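To make the category feature in the list above more concrete: starting a project from a category ultimately boils down to listing the files in one or more Wikimedia Commons categories. The sketch below shows the underlying MediaWiki API call only as an illustration of the data involved; it is not OpenRefine’s actual implementation, the example category name is a placeholder, and recursion into subcategories (category depth) is left out.

  # Hedged sketch: list the files in a Wikimedia Commons category via the
  # MediaWiki API. Illustration only, not OpenRefine's own code.
  import requests

  API = "https://commons.wikimedia.org/w/api.php"

  def category_files(category, limit=50):
      """Return file titles in one Commons category (no subcategory recursion)."""
      params = {
          "action": "query",
          "list": "categorymembers",
          "cmtitle": "Category:" + category,
          "cmtype": "file",
          "cmlimit": limit,
          "format": "json",
      }
      data = requests.get(API, params=params).json()
      return [member["title"] for member in data["query"]["categorymembers"]]

  for title in category_files("Example category name"):  # placeholder category
      print(title)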

Community and stakeholder outreach; documentation and training

  • We held several self-organized info sessions, training sessions and office hours, but also gave training upon invitation on multiple occasions. See the list of these below under 'Project resources'.
  • We actively invited quite a few people to test OpenRefine's new Wikimedia Commons batch editing and upload features, using early snapshot releases of OpenRefine 3.6 and 3.7. This process ran parallel to the more formal user testing sessions and has greatly helped us to discover and solve bugs along the way. We warmly thank all those brave and adventurous testers!
  • We developed some basic documentation (how-to guides) for editing and uploading (to) Wikimedia Commons at Commons:OpenRefine, and in OpenRefine’s own manual. This can still be improved and enhanced.

Project resources


Please provide links to all public, online documents and other artifacts that you created during the course of this project. Even if you have linked to them elsewhere in this report, this section serves as a centralized archive for everything you created during your project. Examples include: meeting notes, participant lists, photos or graphics uploaded to Wikimedia Commons, template messages sent to participants, wiki pages, social media (Facebook groups, Twitter accounts), datasets, surveys, questionnaires, code repositories... If possible, include a brief summary with each link.

General wiki pages


Wikimedia Commons reconciliation service


OpenRefine

  • Code and issue tracker (GitHub)
  • Releases
  • Documentation
  • Workboard on Phabricator (for OpenRefine in general, not just for Wikimedia Commons; main issue tracker for OpenRefine is on GitHub)

Wikimedia Commons Extension for OpenRefine

 
EditGroups tool - OpenRefine on Wikimedia Commons batches

EditGroups for Wikimedia Commons


Wikimedia Commons manifest for OpenRefine


User research and UX design


Webinars, presentations, workshops, training materials


By Wikimedians outside the OpenRefine core team (presentations given after the grant period, but invited during the grant period):

OpenRefine-Wikimedia communications channels


Meeting notes and general documents


Statistics


Learning


The best thing about trying something new is that you learn from it. We want to follow in your footsteps and learn along with you, and we want to know that you took enough risks in your project to have learned something really interesting! Think about what recommendations you have for others who may follow in your footsteps, and use the below sections to describe what worked and what didn’t.

What worked well


What did you try that was successful and you'd recommend others do? To help spread successful strategies so that they can be of use to others in the movement, rather than writing lots of text here, we'd like you to share your finding in the form of a link to a learning pattern.

We endorsed the following learning patterns. Both were relevant for this project.

  • Learning patterns/Set test tasks for recruiting programmers
    • This is great advice. Looking back, we should have done this more thoroughly while recruiting developers for this project, to better understand the skills and strengths of each candidate. It would especially have been useful to assess each candidate’s working style and basic programming knowledge, to check whether they easily process and apply insights from existing technical documentation, and how confidently the candidates engage with OpenRefine’s existing code base.
  • Learning patterns/Conducting user experience research
    • We have engaged UX design and user research in this specific project; this has brought us valuable insights for feature prioritization.

Other things that worked well in this project / grant period:

  • We delivered, and people have started adopting the new features that we have built!
  • We were able to engage with communities beyond the immediate target group of the grant (e.g. NFDI / Wikibase Stakeholder Group), which in turn allowed us to extend certain features and develop patterns of interaction valuable to a broader OpenRefine user base.

What didn’t work


What did you try that you learned didn't work? What would you think about doing differently in the future? Please list these as short bullet points.

  • At the start of this project, we decided to recruit junior-level (not mid- or senior-level) developers. The OpenRefine team is keen and happy to provide mentorship and training to open source developers in general; OpenRefine also participates in the Outreachy and Google Summer of Code programs. Both developers that we hired produced very valuable software from scratch: a full-fledged reconciliation service for Wikimedia Commons, and a dedicated Wikimedia Commons extension which provides very useful Wikimedia Commons-specific features. That said, work on this project would have progressed more quickly if more senior, autonomous developers had been hired, especially developers who would have felt confident contributing to OpenRefine’s core code base and taking ownership of that work.
  • As we already mentioned in the project’s midpoint report: we eventually decided not to develop the (optional, generic) separate upload tool which would allow for larger-scale file uploads. In the context of this grant, this was probably a good decision; it has allowed us to focus more strongly on features inside OpenRefine and its Wikimedia Commons extension. Until now, OpenRefine has not yet been stress-tested with extremely large upload batches (think 10,000s to 100,000s of large files at once). As OpenRefine is used more frequently for Commons batch uploads in the future, experience with such large uploads will teach us about possible blockers (if any), and whether a separate upload tool for large batches would indeed be useful.
  • We published extensive and detailed how-to instructions, but… it is quite clear that not every user reads these. While doing their first uploads, users often asked questions which were already answered in documentation, or made beginners’ mistakes for which clear warnings are actually written up in the how-tos. We implemented some error messaging to avoid common pitfalls, such as mistakes in file names and file paths; more such error messages and workflow support may be needed in the future (e.g. to make sure that users always include licenses and Commons categories in their uploads).
  • In the first half of 2022, we decided to organize monthly office hours at set times, but turnout for these was inconsistent. We decided to not continue organizing these. Instead, we have provided workshops, training and presentations upon specific requests from Wikimedia communities and other stakeholders.

Other recommendations


If you have additional recommendations or reflections that don’t fit into the above sections, please list them here.

OpenRefine, Pattypan, the GLAMwiki Toolset, and other Wikimedia Commons upload tools

With this project, OpenRefine has become a “new player” in the field of batch (upload) tools for Wikimedia Commons. OpenRefine is now a candidate to (partly) replace the advanced, but now defunct, GLAMwiki Toolset. With its advanced native data cleaning features, its emphasis on structured data, and its built-in import functionalities from external databases (e.g. starting projects from web APIs and from many different data formats in general), OpenRefine is quite suitable for adoption by partner organizations and Wikimedians who want to upload files to Wikimedia Commons directly from external sources.

That said, during user testing, quite a few users (including representatives from small cultural institutions) have expressed that they found OpenRefine quite difficult to grasp and that they prefer the easier, wizard-based approach of the Pattypan tool. ABES, a French library umbrella organization, is investing in software packaging for Pattypan, despite being informed about the existence of OpenRefine and its features relevant for GLAMs. On Commons:OpenRefine, we have listed a comparison table of OpenRefine and Pattypan features to help users decide which tool fits their use case best.

Going forward, it will be useful to discuss the Wikimedia movement’s investment in various upload tools for various use cases.

Next steps and opportunities


Are there opportunities for future growth of this project, or new areas you have uncovered in the course of this grant that could be fruitful for more exploration (either by yourself, or others)? What ideas or suggestions do you have for future projects based on the work you’ve completed? Please list these as short bullet points.

Further development of Wikimedia features in OpenRefine: could this be your project?


This project has revealed limitations in the capacity of OpenRefine’s own (small) team, specifically in the amount of support that the core OpenRefine team is able to provide to specific user communities. Going forward, OpenRefine's core developer team will focus on OpenRefine's central data cleaning features and on reducing the project’s general technical debt, making sure that the tool also generally stays operational and serves the general use cases of its (broad, largely non-Wikimedia) user base. The OpenRefine team very much wants to grow its technical contributor community; OpenRefine welcomes new prospective Wikimedia and Wikibase maintainers (either individuals or organizations, or both) to step up, and will be happy to onboard them. Further maintenance and new development for Wikimedia features in the OpenRefine ecosystem depends on new initiatives from the Wikimedia community at large.

In preparation, OpenRefine’s code contributors are laying some groundwork with a proposal to separate OpenRefine’s Wikibase extension into its own repository, and with suggested improvements to OpenRefine’s general extensions architecture.

As a follow up to this project, and to help prospective new Wikimedia contributors, we will list potential new features at OpenRefine/Wikimedia Commons feature ideas (end 2022).

Since OpenRefine is a key tool for the general Wikibase ecosystem as well, there is already some initiative in this specific area: the code repository for the Wikibase reconciliation interface has been transferred to NFDI4Culture, the German Consortium for Research Data on Material and Immaterial Cultural Heritage.

The complexities of batch operations on (structured data on) Wikimedia Commons


In this project, we have learned a lot about the specifics of batch operations on Wikimedia Commons, and about the factors that make this easier or harder.

In particular: contributing to Wikimedia Commons relies on many complex (legacy, pre-structured data) conventions, and there is a general lack of new conventions and (platform) infrastructure to encourage and support broader adoption of structured data. We have clearly noticed that this makes it difficult to develop easy-to-use, intuitive batch upload and contribution tools.

  • Files on Wikimedia Commons must still be uploaded with Wikitext alongside structured data: it is still mandatory to include license templates and Wikimedia Commons categories in Wikitext (a minimal illustration follows below this list). There are a few generic, relatively simple, Lua-driven, fully SDC-powered Wikitext information templates, but these mainly depend on the development efforts of a single Wikimedia Commons contributor.
    • During user tests, we have noticed that long-time Wikimedia Commons contributors still “think in Wikitext” and prioritize it, while newcomers and GLAM staff find structured data much more “right” and intuitive (and would rather not be bothered with Wikitext). In our opinion, SDC is ‘the way forward’ for the Wikimedia movement at large, but considerable investment will be needed to make large-scale adoption a reality. The availability of batch contribution tools like OpenRefine is only a small piece of the puzzle.
  • We included a feature to help uploaders with basic SDC data modeling (schema templates). However, we’ve only been able to confidently provide a few basic schema templates (faithful reproduction of an artwork, photograph of an artwork, book - all depending on a Wikidata item for the creative work). As mentioned above: for most types of Wikimedia Commons files, there are no strict data modeling conventions yet, and hence OpenRefine cannot provide corresponding schema templates that are based on consensus and broad community support. Examples include: prints, archaeological objects, specimens, and generally all cases where it doesn’t fully make sense to create a corresponding Wikidata item for the depicted object.
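To illustrate the two points above: each batch-uploaded file currently still needs a legacy Wikitext layer (an information template, a license template, a category) next to its structured data, such as a ‘depicts’ (P180) statement. The sketch below is purely illustrative: the template names, field values and Q-id are example values, and the claim dictionary follows the generic Wikibase JSON shape rather than the exact output of OpenRefine or any other tool.

  # Hedged sketch of the two layers a Commons batch upload still needs per
  # file: legacy Wikitext plus an SDC statement. All concrete values are
  # examples, not prescriptions.
  import json

  def build_wikitext(description, date, source, author, category):
      """Minimal Wikitext body: Information template, license template, category."""
      lines = [
          "=={{int:filedesc}}==",
          "{{Information",
          " |description = " + description,
          " |date        = " + date,
          " |source      = " + source,
          " |author      = " + author,
          "}}",
          "",
          "=={{int:license-header}}==",
          "{{Cc-by-sa-4.0}}",  # example license template; use the one that applies
          "",
          "[[Category:" + category + "]]",
      ]
      return "\n".join(lines)

  def depicts_claim(qid):
      """A 'depicts' (P180) statement in generic Wikibase claim JSON."""
      return {
          "mainsnak": {
              "snaktype": "value",
              "property": "P180",
              "datavalue": {
                  "type": "wikibase-entityid",
                  "value": {"entity-type": "item", "id": qid},
              },
          },
          "type": "statement",
          "rank": "normal",
      }

  print(build_wikitext("A demonstration image", "2022-10-01", "own work",
                       "Example Author", "Test uploads"))
  print(json.dumps({"claims": [depicts_claim("Q12418")]}, indent=2))  # example item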

The future of batch functionalities in the Wikimedia ecosystem

 
Memorial for the GLAMwiki Toolset, which served the Wikimedia movement from 2012 until 2022. Mini Hackathon in Utrecht, November 2022

It may be useful to follow up on this grant with in-depth conversations about the future of batch (mass) contribution tools on Wikimedia platforms. Batch uploading and batch editing are core needs and use cases on Wikimedia platforms, especially in the context of structured data and content partnerships with external partner organizations, but Wikimedia projects don’t offer these functionalities in the core platform. Since the 2030 Wikimedia Movement Strategy highlights the importance of content partnerships in the Strategic Direction, in the Movement Strategy Principles, and in the Innovate in Free Knowledge Strategy Recommendation: how does the Wikimedia community at large want to sustainably address and serve these needs in the longer term? Various questions and considerations may serve as guidance here:

  • To what extent is it responsible and ethical to rely on free labor, and on volunteer developers, for such key features?
  • Who will take up responsibility for long term further development, maintenance…?
  • What are the pros and cons of relying on external tool(s), versus in-house development of batch contribution features on Wikimedia platforms? What are the benefits, pitfalls, and blockers of integrating and maintaining such features within the general Wikimedia platforms / MediaWiki technical ecosystem?
    • Various approaches have now been tried: a Commons batch upload tool built as a MediaWiki extension (GLAMwiki Toolset), various fully external autonomous tools inside the Wikimedia tool ecosystem (Pattypan, QuickStatements...), and functionalities which are part of (the framework of) an external tool used by a broader but largely external community (OpenRefine). All have their pros and cons!

Part 2: The Grant


Finances


Actual spending


Please copy and paste the completed table from your project finances page. Check that you’ve listed the actual expenditures compared with what was originally planned. If there are differences between the planned and actual use of funds, please use the column provided to explain them.

Expense | Approved amount | Actual funds spent | Difference
Product Manager | $26,800.00 ($20,000.00 during original grant application, $6,800.00 for extension) | $26,727.31 | $72.69
OpenRefine developer | $62,400.00 ($42,000.00 during original grant application, $20,400.00 for extension) | $67,640.00 | -$5,240.00
Wikimedia developer | $16,000.00 (during original grant application) | $16,000.00 | $0.00
Additional developer honoraria | $4,500.00 (during original grant application) | $4,048.04 | $451.96
Technical mentorship | $5,280.00 (for extension) | $5,320.00 | -$40.00
Design | $10,560.00 (for extension) | $6,400.00 | $4,160.00
Meeting costs | $1,500.00 (during original grant application) | $1,663.91 | -$163.91
Wire fees | $900.00 ($500.00 during original grant application, $400.00 for extension) | $140.70 | $759.30
Project leadership, administration, accounting, strategic support | $22,577.64 ($14,911.76 during original grant application, $7,665.88 for extension) | $22,577.68 | -$0.04
Total | $150,517.64 ($99,411.76 during original grant application, $51,105.88 for extension) | $150,517.64 | $0.00


Rationale for large differences in spending for several budget lines:

Remaining funds


Do you have any unspent funds from the grant? Please answer yes or no. If yes, list the amount you did not use and explain why.

No.

If you have unspent funds, they must be returned to WMF. Please see the instructions for returning unspent funds and indicate here if this is still in progress, or if this is already completed:

N/a

Documentation


Did you send documentation of all expenses paid with grant funds to grantsadmin@wikimedia.org, according to the guidelines here? Please answer yes or no. If no, include an explanation.

Yes. An overview of expenses can be found in this spreadsheet. Digital copies of invoices and receipts can be obtained from Code for Science and Society upon request; see email addresses provided in the linked spreadsheet.

Confirmation of project status


Did you comply with the requirements specified by WMF in the grant agreement? Please answer yes or no.

Yes.

Is your project completed? Please answer yes or no.

Yes.

(As mentioned above: this project has delivered Wikimedia Commons batch editing and upload functionalities in OpenRefine. An official first release of OpenRefine 3.7 is underway; additional work on documentation and training will be welcome.)

Grantee reflection


We’d love to hear any thoughts you have on what this project has meant to you, or how the experience of being a grantee has gone overall. Is there something that surprised you, or that you particularly enjoyed, or that you’ll do differently going forward as a result of the Project Grant experience? Please share it here!

(Reflection by Sandra Fauconnier, who was Product Manager of this project)

I have already listed many topics of reflection around the project and its context above, and will use this section to list some observations and thoughts around being a Wikimedia Project grantee specifically.

This was my first major Wikimedia Project grant. In the past, I have received and managed other grants from different sources; mainly grants in the Dutch cultural heritage field, usually for higher amounts than the one awarded here. I have also been a member of a Dutch cultural sector grants committee for 4 years. Additionally, while this specific project for OpenRefine was running, OpenRefine has also received (and was working on) a few other grants from other funders.

Compared to most other funders I’ve worked with before, the Wikimedia grants process includes a longer and more participatory/interactive application process, stricter requirements with regards to reporting (including monthly reports), and stricter rules around budgeting and spending (e.g. the requirement to get permission before transferring parts of a budget between budget lines). In the context of the Wikimedia movement this makes sense: projects are run in and with the Wikimedia community, and benefit from visibility and transparency. That said, this process also creates additional work for the grantee. On average, I have probably spent 6-8 hours per month on grant-related work (mainly reporting), which was 15% to 20% of my allocated hours and time I could partly have spent otherwise. While extensive reporting has been useful internally, and helped in compiling this final report, some of the required (monthly) reporting duplicated effort: I have also reported about this project in various community channels (mainly the This Month in GLAM newsletter). To keep funded projects visible, it may be worth encouraging grantees to use such community channels, perhaps, where relevant, in place of monthly reports here on Meta.