Grants:IdeaLab/Automated Quality assessment

Automated Quality assessment
Use revision scoring (revscoring/ORES) to update the quality assessment workflow and provide feedback on new editors' work.
Idea creator: Slowking4
Advisor: Fuzheado
This project needs: volunteer, developer, designer, community organizer, researcher
Created on: 16:18, 3 March 2016 (UTC)


Project idea

What is the problem you're trying to solve?

 
[Figure: English Wikipedia article assessment progress, June 2011; a chart of assessment counts, but not of assessment accuracy.]

Quality assessment of articles is moribund; it tends to be a one-off effort by a WikiProject, of historical interest only.

What is your solution?

Automated or semi-automated assessment in near real time would provide feedback on work progress.

At an edit-a-thon, we experimented with copying revision-score output into a table of articles on a to-do list.

We found that the quality assessments on the article talk pages were not accurate.
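
A minimal sketch of the table-filling step described above, assuming the public ORES v3 web API and the English Wikipedia article quality model (named "wp10" at the time, later "articlequality"); the helper functions and example titles are illustrative, not an existing tool:

```python
# Sketch: fetch ORES article quality predictions for a to-do list of articles.
import requests

WIKI_API = "https://en.wikipedia.org/w/api.php"
ORES_API = "https://ores.wikimedia.org/v3/scores/enwiki/"

def latest_revid(title):
    """Latest revision ID of an article, via the MediaWiki query API."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids",
        "format": "json",
    }
    pages = requests.get(WIKI_API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))
    return page["revisions"][0]["revid"]

def predicted_class(revid, model="wp10"):
    """Predicted assessment class (Stub/Start/C/B/GA/FA) for one revision."""
    resp = requests.get(ORES_API, params={"models": model, "revids": revid}).json()
    return resp["enwiki"]["scores"][str(revid)][model]["score"]["prediction"]

# Illustrative article titles standing in for an edit-a-thon to-do list.
for title in ["Augusta Savage", "Palmer Hayden"]:
    revid = latest_revid(title)
    print(f"{title}\t{revid}\t{predicted_class(revid)}")
```

Each row (title, latest revision ID, predicted class) could then be pasted into the to-do table and compared against the assessment on the article's talk page.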

  • m:Objective Revision Evaluation Service
  • w:Wikipedia:Meetup/DC/SAAM_AU_2016#Harlem_Renaissance_artists
  • GLAM/Newsletter/February 2016/Contents/USA report

Goals

If we had a semi-automated dashboard for WikiProjects, edit-a-thons, or educators, we could manage editing progress better than by merely looking at edit counts.

A semi-automated tool could run revision scoring and revise the talk page assessment. A Good or Featured prediction could be rounded down to A-class and the article entered into the promotion process.

New editors don't receive good feedback about their editing; the metrics tend to be edit counts or bytes only. By reporting the change in revision score, near-real-time feedback on quality improvement can be provided.

This could be migrated to automated updates of article talk pages and could feed articles into the Good and Featured article assessment queues. Edit-a-thon attendees and education assignment editors could receive automated talk page feedback about their revision-score quality improvements.
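
A minimal sketch of that score-change feedback, again assuming the ORES v3 endpoint and the "wp10" model; the 0-5 class weighting and the revision IDs below are illustrative assumptions, not part of ORES or of any existing bot:

```python
# Sketch: report the change in ORES quality score between two revisions of an
# article, e.g. the revision before an edit-a-thon and the editor's latest one.
import requests

ORES_API = "https://ores.wikimedia.org/v3/scores/enwiki/"
# Illustrative weighting of the model's assessment classes onto a 0-5 scale.
CLASS_WEIGHTS = {"Stub": 0, "Start": 1, "C": 2, "B": 3, "GA": 4, "FA": 5}

def quality_score(revid, model="wp10"):
    """Weighted sum over the model's class probabilities for one revision."""
    resp = requests.get(ORES_API, params={"models": model, "revids": revid}).json()
    probs = resp["enwiki"]["scores"][str(revid)][model]["score"]["probability"]
    return sum(CLASS_WEIGHTS[cls] * p for cls, p in probs.items())

def improvement(rev_before, rev_after):
    """Change in weighted quality score attributable to the edits in between."""
    return quality_score(rev_after) - quality_score(rev_before)

# Hypothetical revision IDs bracketing one editor's work on an article.
print(f"Quality score change: {improvement(700000000, 710000000):+.2f}")
```

A number like this, posted to the editor's talk page or shown on a dashboard, is the kind of near-real-time feedback the goals above describe.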

Get Involved

About the idea creator

Wikimedia DC (WMDC)

Participants

  • Advisor: provide feedback from existing experiments. Fuzheado (talk) 17:16, 15 March 2016 (UTC)
  • Willing to help: Smallbones (talk) 20:33, 22 March 2016 (UTC)

Endorsements

  • Quality assessment needs improvement Thennicke (talk) 00:35, 15 March 2016 (UTC)
  • Fuzheado (talk) 14:23, 15 March 2016 (UTC) - As one of the folks doing the most in this area so far
  • It would also reward the efforts of the contributors and hence motivate them. The RedBurn (talk) 17:09, 15 March 2016 (UTC)
  • ORES' article quality model has been shown to be relatively robust for this use case. Since the model can already be applied in real time, it should be easy to experiment with visualizations and bots/UIs that surface quality scores. --Halfak (WMF) (talk) 18:19, 15 March 2016 (UTC)
  • Some initial thoughts on a similar project, WikiRate, that was proposed in 2014 (before the existence of ORES) can be found on the WMUK website. The talk page includes the results of a literature search, which may be of some interest. MichaelMaggs (talk) 21:55, 15 March 2016 (UTC)
  • endorse - I've done a few things in this area already. Smallbones (talk) 20:34, 22 March 2016 (UTC)
  • Endorse, but this will never happen. Quality assessments need a lot of work, and I did a ton of it, but you are going to see a number of problems implementing something like this. First, not all WikiProjects grade the same, so unless you are going to force standardized assessment criteria (good luck), projects or individuals are going to fight this. Also, quality assessment tagging is largely viewed by some of the community as a waste of time, so if you do it, it's used against you when applying for advanced rights because it's non-mainspace work. Reguyla (talk) 19:36, 13 April 2016 (UTC)

Expand your idea

Would a grant from the Wikimedia Foundation help make your idea happen? You can expand this idea into a grant proposal.

  • Expand into an Individual Engagement Grant
  • Expand into a Project and Event Grant