Research talk:Autoconfirmed article creation trial/Work log/2017-09-11

Monday, September 11, 2017

Today I'll be writing up a short analysis I did of unreview events.

Unreview events

We've previously identified a delay in review time as a possible heuristic for identifying Time-Consuming Judgement Calls (ref the Aug 31 work log). However, we'd also be interested in finding other ways of identifying them. Maybe an article being marked as "reviewed" followed by being marked as "unreviewed" is one way? I decided to look into that.

I decided to only use "reviewed" and "unreviewed" pairs from the PageTriage extension. This means that if a patroller marked an article as "reviewed" using a different tool and someone later marked it as "unreviewed" using the PageTriage extension, we will not catch that pair. We'll leave those cases as future work for the moment.
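
As a reference for how such pairs could be assembled, here is a minimal sketch. It assumes the PageTriage log events have already been pulled into a pandas DataFrame with hypothetical columns page_id, timestamp, user, and action; it is not the exact code used for this analysis.

```python
import pandas as pd

def reviewed_unreviewed_pairs(log_events: pd.DataFrame) -> pd.DataFrame:
    """Pair each PageTriage 'unreviewed' event with the most recent
    preceding 'reviewed' event on the same page.

    Assumes columns: page_id, timestamp, user, action ('reviewed'/'unreviewed').
    """
    events = log_events.sort_values(['page_id', 'timestamp'])
    pairs = []
    for page_id, page_events in events.groupby('page_id'):
        last_review = None
        for row in page_events.itertuples():
            if row.action == 'reviewed':
                last_review = row
            elif row.action == 'unreviewed' and last_review is not None:
                pairs.append({
                    'page_id': page_id,
                    'review_user': last_review.user,
                    'review_time': last_review.timestamp,
                    'unreview_user': row.user,
                    'unreview_time': row.timestamp,
                })
                last_review = None  # pair each review at most once
    return pd.DataFrame(pairs)
```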

There are 12,058 observations in our dataset, which is only a small fraction of the total review activity: across the same time span, PageTriage logged 633,036 reviews, so we are looking at about 1.9% of that. The dataset spans almost five years, meaning the average number of unreview actions per day is fairly low. If we ballpark the span at 1,700 days (roughly five years minus three months), that comes out to about seven unreview events per day.
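
These are back-of-the-envelope figures; the arithmetic is simply:

```python
unreview_events = 12_058
total_reviews = 633_036
days = 1_700  # rough span: a little under five years

print(f"share of review activity: {unreview_events / total_reviews:.1%}")  # ~1.9%
print(f"unreview events per day:  {unreview_events / days:.1f}")           # ~7.1
```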

A key property of these events is that the unreview is almost always done by the same user who marked the article as reviewed: 91.4% of the events have this property. They also do this very quickly: 89.6% of them unreview less than 15 minutes after marking the article as reviewed, something which is also clear from the distribution of time to "unreviewed" for self-unreview events:

[Plot: distribution of time from review to unreview for self-unreview events]
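
The two proportions mentioned above can be computed directly from the paired events; a sketch, reusing the hypothetical pairs DataFrame from the earlier example:

```python
import pandas as pd

pairs = reviewed_unreviewed_pairs(log_events)  # sketch from above

# Proportion of unreviews done by the same user who marked the article reviewed
is_self = pairs['review_user'] == pairs['unreview_user']
print(f"self-unreviews: {is_self.mean():.1%}")  # reported above as 91.4%

# Of the self-unreviews, how many happen within 15 minutes of the review?
delta = pairs['unreview_time'] - pairs['review_time']
quick = delta[is_self] < pd.Timedelta(minutes=15)
print(f"within 15 minutes: {quick.mean():.1%}")  # reported above as 89.6%
```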

One interesting thing to note is that 13.7% of the time we see a triplet of "reviewed", then "unreviewed", then "reviewed" (all by the same user), with less than an hour between each event. Given that unreviews happen quickly, perhaps there is a signal here: if an article is unreviewed and not marked as "reviewed" by the same user again later, that might be a strong indication that it's a TCJC.
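
A sketch of how such triplets could be counted, again using the hypothetical event columns from above. The exact denominator behind the 13.7% figure is not spelled out here, so this version simply flags pages that contain such a sequence:

```python
import pandas as pd

def has_quick_rereview_triplet(page_events: pd.DataFrame,
                               max_gap=pd.Timedelta(hours=1)) -> bool:
    """True if the page has a reviewed -> unreviewed -> reviewed sequence,
    all by the same user, with less than max_gap between consecutive events."""
    ev = page_events.sort_values('timestamp').reset_index(drop=True)
    for i in range(len(ev) - 2):
        window = ev.iloc[i:i + 3]
        if (list(window['action']) == ['reviewed', 'unreviewed', 'reviewed']
                and window['user'].nunique() == 1
                and window['timestamp'].diff().dropna().max() < max_gap):
            return True
    return False

triplet_share = (log_events.groupby('page_id')
                 .apply(has_quick_rereview_triplet)
                 .mean())
print(f"pages with a quick re-review triplet: {triplet_share:.1%}")
```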

There are not many unreview events done by other users; in our dataset the peak is at ten:

[Plot: unreview events done by other users]

However, if an article is unreviewed by a different user, it tends to happen more slowly:

[Plot: distribution of time from review to unreview for unreviews by a different user]

Since the X-axis in the plot above is on a log scale, it is worth pointing out that the peak is at around an hour, and that most of these unreviews appear to happen within a day.

One thing to consider moving forward is the scale. In the time since PageTriage was introduced, there have been about 1.7 million articles created. Some of those are created by autopatrolled users, something I'll know more about shortly. Say it's 10% or so; that gives us about 1.5 million articles that required patrolling. Of those, 20% are reviewed after more than a week, which is about 300,000 articles. Our 12,000 unreview events are then about 4% of those. Can we find measures that capture a larger proportion?
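
Spelled out, the rough estimate looks like this (all the round numbers are the guesses from the paragraph above, not measured values):

```python
articles_since_pagetriage = 1_700_000
autopatrolled_share = 0.10   # guess; to be replaced with the measured proportion
slow_review_share = 0.20     # share reviewed after more than a week
unreview_events = 12_000

needs_patrolling = articles_since_pagetriage * (1 - autopatrolled_share)  # ~1.5 million
slow_reviews = needs_patrolling * slow_review_share                       # ~300,000
print(f"unreviews as a share of slow reviews: {unreview_events / slow_reviews:.0%}")  # ~4%
```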

Proportion of articles created by autopatrolled users

What proportion of all article creations is made by users who have the "autopatrol" right, meaning their articles don't need review?

We use two datasets to measure this. Both are grabbed from the data lake using the same underlying query, which is our best query for getting data on article creations. This query is documented for H9. One version of the query identifies articles created by users who are not in the "autoreviewer", "bot", or "sysop" user groups, which are the groups that have the "autopatrol" right. In other words, it identifies articles created by non-autopatrolled users. From there, it is straightforward to calculate the proportion of autopatrolled article creations. Calculated per day and plotted from Jan 1, 2009 to July 1, 2017, it looks like this:

[Plot: daily proportion of articles created by autopatrolled users, Jan 1, 2009 to July 1, 2017]
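
As a sketch of the final calculation, assume the two versions of the query have been exported as daily counts; the file names and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical exports of the two query versions: one row per day, columns date and n
all_creations = pd.read_csv('all_creations_per_day.csv', parse_dates=['date'])
non_autopatrolled = pd.read_csv('non_autopatrolled_per_day.csv', parse_dates=['date'])

daily = all_creations.merge(non_autopatrolled, on='date',
                            suffixes=('_all', '_non_auto'))
daily['prop_autopatrolled'] = 1 - daily['n_non_auto'] / daily['n_all']

# Restrict to the plotted window: Jan 1, 2009 to July 1, 2017
window = daily[(daily['date'] >= '2009-01-01') & (daily['date'] < '2017-07-01')]
ax = window.plot(x='date', y='prop_autopatrolled', legend=False)
ax.set_ylabel('Proportion of articles created by autopatrolled users')
```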