WikiCred/2022 CFP/CiteFix: a tool to fix broken references

English Wikipedia has several maintenance categories that are auto-populated based on errors in referencing and citation templates. There are thousands of pages in these categories that are left unattended for a long time. While some of these are easy to fix manually, some of them aren’t. Our idea is to develop a tool named “CiteFix” which will suggest changes to users to fix the errors.

CiteFix: a tool to fix broken references
A WikiCred 2022 Grant Proposal
Project type: Technology
Author: Jay Prakash (Jayprakash12345)
Contact: 0freerunning@gmail.com
Requested amount: USD 10,000
Award amount: Unknown
What is your idea?

In layman's terms, the tool will scan through the pages in various defined categories and suggest edits that a user can act on. The interface will display a preview and an edit space for both the current version and the suggested version. If the user feels that the suggested edit is fine, they will be able to apply the fix with a single click. They will also have the option to modify the suggested edit in case the tool cannot generate the suggestion properly, and they can skip a suggestion if they are not sure whether it is right. The tool will mark all of its edits with a hashtag, which can be used to track usage of the tool over time.
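
As a rough illustration of that last point, the sketch below (Python, MediaWiki Action API) shows how a fix could be saved with a tracking hashtag in the edit summary. The API endpoint and parameters are the standard Action API ones, but `session` is assumed to be an already OAuth-authorised requests session, and the "#CiteFix" hashtag and the save_fix function name are placeholders rather than decided parts of the tool.

    import requests  # the session itself would be OAuth-signed, e.g. via requests-oauthlib

    API = "https://en.wikipedia.org/w/api.php"

    def save_fix(session: requests.Session, title: str, new_wikitext: str) -> dict:
        """Save a suggested fix on behalf of the logged-in user (sketch only)."""
        # Fetch a CSRF token for the authenticated user.
        token = session.get(API, params={
            "action": "query", "meta": "tokens", "type": "csrf", "format": "json",
        }).json()["query"]["tokens"]["csrftoken"]
        # Save the edit; the hashtag in the summary lets us track tool usage later.
        return session.post(API, data={
            "action": "edit",
            "title": title,
            "text": new_wikitext,
            "summary": "Fix broken reference (#CiteFix)",  # hashtag name is a placeholder
            "token": token,
            "format": "json",
        }).json()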

For this project, we will focus on the following errors:

  • Cite error: There are <ref group=lower-alpha> tags or {{efn}} templates on this page, but the references will not show without a {{reflist|group=lower-alpha}} template or {{notelist}} template.
      • Help page: https://w.wiki/uLe
      • Example page: https://en.wikipedia.org/wiki/Education_in_India
      • Fix: The tool will suggest adding {{reflist|group=lower-alpha}}, along with a section heading if one is not already there, at an appropriate position in the article.
  • Cite error: The named reference $1 was invoked but never defined.
      • Help page: https://w.wiki/uLi
      • Example page: https://en.wikipedia.org/wiki/Foundation_(cosmetics)
      • Fix: This error arises largely in two ways: a) an editor removed the citation while editing the content in a previous revision, unaware that the citation is named and used multiple times in the article; b) an editor copied content from another Wikipedia article in the same category, which happens especially in sports articles when tables are listed. The tool will scan through the edit revisions and the pages (at depth 1) of the categories of the page in question. If it finds a reference with the required name, it will suggest adding the full reference to the current version of the article. A rough sketch of this detection step appears after this list.
  • Cite error: The named reference "sofield" was defined multiple times with different content (see the help page).
      • Help page: https://w.wiki/uMX
      • Example page: https://en.wikipedia.org/wiki/Interstate_95
      • Fix: The tool will suggest one of two options depending on the case. If the full reference text is the same for both (or all) definitions, the suggestion will be to remove the duplicate full text and insert the re-use (invocation) syntax. If the full reference texts differ, the suggestion will be to rename one or more of the references in question.
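
To give a concrete (and deliberately simplified) idea of the pattern matching involved in the second error above, here is a minimal Python sketch that finds reference names invoked with a self-closing <ref name="..." /> tag but never defined with full content. It ignores many real-world variants (list-defined references, extra attributes such as group, template-based citations) that the actual tool would have to handle.

    import re

    # Matches <ref name="x"> (a full definition) and <ref name="x" /> (a re-use).
    # Simplified: refs carrying extra attributes (e.g. group=) are not handled here.
    REF_TAG = re.compile(
        r'<ref\s+name\s*=\s*"?(?P<name>[^">/]+)"?\s*(?P<selfclosing>/)?>',
        re.IGNORECASE,
    )

    def find_undefined_named_refs(wikitext: str) -> set:
        """Return reference names that are invoked but never defined in the wikitext."""
        invoked, defined = set(), set()
        for match in REF_TAG.finditer(wikitext):
            name = match.group("name").strip()
            if match.group("selfclosing"):
                invoked.add(name)   # <ref name="x" /> re-uses an existing definition
            else:
                defined.add(name)   # <ref name="x">full citation text</ref>
        return invoked - defined

Any name returned by such a function would then be searched for in earlier revisions and in related pages (categories at depth 1), as described above, to recover the full citation text.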

Steps

We will approach the problem from a technical standpoint in the following way:

  • Helper library: While developing a large tool, it is important to organise the logic cleanly so that others can follow the code. A modular approach also makes it easier for newcomer developers to get involved. So initially, we will create a helper-function library containing functions for getting the wikitext of pages, identifying patterns for reference errors, generating suggestion diffs of fixes, saving text back to Wikipedia, and so on.
  • Web backend: After developing the helper library, we will integrate it with a backend framework. We will use the Flask framework for this tool because it is lightweight and integrates easily with other Python libraries. So far, we have developed eight tools for the Wikimedia community using Flask.
  • Integration with OAuth: The next big step is to integrate Wikimedia OAuth with the tool so that edits are made by a user account rather than anonymously (by IP). A minimal sketch of this handshake appears after this list.
  • Web frontend: We will build the frontend of this tool with ReactJS, one of the most widely used frontend libraries. ReactJS lets us manage application state centrally with the Redux JavaScript library, so users stay on a single page and the browser never performs a full reload. There are two benefits when the browser does not reload:
      • The whole web app loads once and does not spend extra bandwidth downloading the page again and again, so the app can run on slow connections.
      • It makes for a better user experience.
  • Web frontend UI: User interfaces play a vital role: a good user interface encourages users to contribute more. We will use Material-UI as the UI component library for the React app. Material Design is widely adopted; websites such as Gmail, Reddit, Dropbox, and Pinterest use Material-style design. For a demo of Material-UI, please visit http://bodh.toolforge.org.
  • I18n of the tool: Internationalisation of the interface language is an important part of Wikimedia community best practices. We will integrate the tool's i18n with https://translatewiki.net, the translation platform used across the Wikimedia movement, which allows community volunteers to translate wiki and tool interfaces into their own languages. By the end of this project, we aim to have the tool interface available in at least 30 languages.
  • Worker queue: At this point the tool will be ready for English Wikipedia, but a single server's limited resources can serve only a limited number of users, which is not scalable. To make the tool scalable, we will set up eight worker queues in the background, giving it the capacity to serve thousands of users at a time. An example of a worker-queue system is http://quarry.wmflabs.org, which handles very heavy requests through workers and lets users run queries that take minutes to execute.
  • Deployment of the tool: We will use Wikimedia's Cloud VPS for server hosting. Cloud VPS has advantages over Toolforge: unlike Toolforge, which is shared hosting, Cloud VPS gives us a separate server for the tool's deployment, and we can customise it according to the tool's requirements. Because it is free, we do not need to pay anything for hosting, and the tool can stay available long-term.
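
As a hedged illustration of the OAuth step above, the following sketch uses Flask together with the mwoauth library's Handshaker flow. The route names, session handling, and the consumer key and secret are placeholders; the real credentials would come from the consumer registration approved on Meta-Wiki.

    from flask import Flask, redirect, request, session
    from mwoauth import ConsumerToken, Handshaker, RequestToken

    app = Flask(__name__)
    app.secret_key = "change-me"  # placeholder

    MW_URI = "https://meta.wikimedia.org/w/index.php"
    consumer_token = ConsumerToken("CONSUMER_KEY", "CONSUMER_SECRET")  # placeholders

    @app.route("/login")
    def login():
        handshaker = Handshaker(MW_URI, consumer_token)
        redirect_url, request_token = handshaker.initiate()
        # Keep the request token so the callback can complete the handshake.
        session["request_token"] = dict(zip(("key", "secret"), request_token))
        return redirect(redirect_url)

    @app.route("/oauth-callback")
    def oauth_callback():
        handshaker = Handshaker(MW_URI, consumer_token)
        request_token = RequestToken(**session.pop("request_token"))
        access_token = handshaker.complete(request_token, request.query_string.decode())
        identity = handshaker.identify(access_token)
        # Store the access token; it is used later to sign edit requests.
        session["access_token"] = dict(zip(("key", "secret"), access_token))
        return "Logged in as " + identity["username"]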


Why is it important?

Though many of these errors are an easy fix for experienced users, that is not the case for new editors and readers of Wikipedia. From a reader's perspective in particular, these errors generate big red tags that make for a poor reading experience. Also, a general reader who wants to verify the source cannot do so, because error text is displayed instead of the full reference.

There are at least ten thousand pages in all the above categories combined as of now, and the tool can also be used to fix errors that arise in the future. New pages are highly likely to keep being added to these error categories, as citation templates are prone to mistakes, especially by new editors, and this tool can help them fix things easily.


Link(s) to your resume or anything else (CV, GitHub, etc.) that may be relevant

https://docs.google.com/document/d/1INsMe4qTh1QYWf4IQciDK_rahlyqnq57RdjMDk-pfvE


Is your project already in progress?

n/a


How is this project relevant to credibility and Wikipedia?

According to WP:WHYCITE, “By citing sources for Wikipedia content, you enable users to verify that the information given is supported by reliable sources, thus improving the credibility of Wikipedia while showing that the content is not original research. You also help users find additional information on the subject; and by giving attribution you avoid plagiarising the source of your words or ideas.”

Every citation added improves the credibility and verifiability of Wikipedia. However, it is not enough just to add a citation; it should also be added the right way. If not, it defeats the whole purpose, because a general reader will not be able to access the source. In a community of thousands of volunteers, mistakes are inevitable. This tool helps to fix those mistakes in a simple, easy, and engaging way, thereby contributing to the credibility of Wikipedia, at least a little.


What is the ultimate impact of this project?

We envision the ultimate impact of the project in two parts. The first is internationalisation: although at this stage we will focus on English Wikipedia only, we will ensure that the tool can be internationalised for other language Wikipedias without a huge effort. If the tool is used on more language Wikipedias, the impact will be far more significant than on English Wikipedia alone. As we progress towards the end of the project, we will talk to at least one or two Wikipedians from other language communities about whether they would like to adopt it, see what is possible, and develop documentation for the same.

The second component is using this tool in campaigns. A campaign focused on fixing such basic issues can be a great help for newcomers, and the tool can also be used in campaigns such as 1lib1ref.


Can your project scale?

Yes. The answer mostly lies in the answer to "What is the ultimate impact of this project?": scaling mainly means expansion to other language Wikipedias.


Why are you the people to do it?

Jay Prakash is a seasoned developer who has created several tools and has been working on MediaWiki since 2017 (more than 500 of his patches have been merged), as can be seen from his GitHub and Phabricator profiles. He is an expert in web applications and MediaWiki. Additionally, Jay was a Google Summer of Code intern with the Wikimedia Foundation in 2019.


What is the impact of your idea on diversity and inclusiveness of the Wikimedia movement?

The project relates directly to the overall credibility of Wikipedia, which can support diversity in many ways, and because the tool can be used by people who are not tech-savvy, it is inclusive. Beyond that, while there may be unforeseen impacts, we cannot establish a more direct connection on this issue.


What are the challenges associated with this project and how you will overcome them?
  • Integration with Wikimedia: Logging in via OAuth with Wikimedia projects is not straightforward for tools. We have to apply for app credentials on Meta-Wiki and wait for the application to be approved. After that, we will use the mwoauth Python library in our web backend to handle the OAuth handshake with Wikimedia projects.
  • Reference error patterns: The tool works primarily on pattern identification, and this cannot be done with just one expression or function; many patterns need to be identified. We will maintain a set of regular expressions and update it continuously as new patterns are found, and we will create a form so users can report any unrecognised patterns.
  • Mobile friendliness: This will be a challenge, as we have rarely worked on mobile-friendly web apps. We will use CSS media queries, which allow us to detect device classes such as mobile, tablet, and desktop, and then style the app (margins, padding, box model, etc.) accordingly, setting breakpoints based on screen size.
  • Deployment of workers: Worker queues run in the background, so they need continuous monitoring. We will use Celery for the tool's workers and will write a custom script that checks their status and restarts the workers if it finds them unresponsive (a rough sketch of such a watchdog appears after this list).
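
A rough sketch of the kind of watchdog script described in the last point, assuming a hypothetical Celery application module named citefix_tasks and a hypothetical systemd service named citefix-worker on the Cloud VPS instance:

    import subprocess
    from citefix_tasks import app  # hypothetical module that defines the Celery app

    def workers_alive() -> bool:
        """Ping all running Celery workers; False means none responded."""
        replies = app.control.inspect(timeout=5).ping()
        return bool(replies)

    if __name__ == "__main__":
        if not workers_alive():
            # Restart the worker service; the service name is a placeholder.
            subprocess.run(["sudo", "systemctl", "restart", "citefix-worker"], check=False)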


How will you spend your funds?
  • Developer: USD 7,000 (200 hrs * $35/hr)
  • Product Manager: USD 3,000 (150 hrs * $20/hr)


How long will your project take?

The following will be the broad timeline:

  • Month 0: Research and prep
  • Month 1: Development Sprint 1
  • Month 2: Phase 1 Development concludes
  • Month 3: User Testing Phase 1
  • Month 3: Development Sprint - 2
  • Month 4: User Testing Phase 2 & Campaign exploration
  • Month 5: Phase 2 (and Final) Development concludes
  • Month 6: Documentation and reporting


Have you worked on projects for previous grants before?

https://meta.wikimedia.org/wiki/Grants:Project/Jayprakash12345/Improve_documentation_of_MediaWiki_maintenance_scripts