Community health initiative/Quarterly updates

Quarterly updates for the Anti-Harassment Tools team.

Annual Plan objectives status - December 2017

Summary

The annual plan was drafted before the full team was even hired, and it was ambitious and optimistic. Many of the objectives will not be achieved due to team velocity and subsequent reprioritization, but we have still delivered some value and anticipate continued success over the next six months. 🎉

Over the past six months we've made some small improvements to AbuseFilter and AntiSpoof and are currently developing the Interaction Timeline. We've also made progress on work not included in these objectives: some Mute features, as well as allowing users to restrict which user groups can send them direct emails.

Over the next six months we'll conduct a cross-wiki consultation about (and ultimately build) Blocking tools and improvements and will research, prototype, and prepare for development on a new Reporting system.

Outcome 1: Detection/Prevention

  • Objective 1: Implement performance, usability, and anti-spoof improvements to AbuseFilter so admins can craft filters to block words and strings that violate community policies.
    • Project documentation: Community health initiative/AbuseFilter
    • Status: AntiSpoof improvements are complete, we’ve decided not to make any usability improvements, and we’ve implemented additional performance monitoring.
    • Next steps: We're discussing on English Wikipedia's Edit filter noticeboard how we can better surface the performance statistics and reports in the AbuseFilter interface. More likely than not we will make only a minimal change, if any. (A toy sketch of the kind of filtering these tools perform follows this list.)
  • Objective 2: Implement stability improvements to ProcseeBot, a proxy white/blacklist, so admins can more effectively manage troublesome proxies.
    • Project documentation: Community health initiative/Blocking tools and improvements/Open proxies
    • Status: This objective, as currently worded, has been rejected. Further research showed that stability improvements are not needed; the opportunity lies in making ProcseeBot run on Meta to set global blocks. We've discussed this possibility with the bot's creator and a handful of other users.
    • Next steps: We are currently conducting a consultation about Blocking tools, which includes this topic.
  • Objective 3: Based on social interaction modeling research, build a prototype to surface potential cases of wikihounding to admins.
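To make the filtering concept concrete, below is a minimal Python sketch of the kind of check an edit filter performs: fold lookalike characters to a canonical form (the idea behind the AntiSpoof work), then match the result against blocked patterns. The confusables map and pattern list here are hypothetical toys; real filters are written in the AbuseFilter rule language by edit filter managers, and the real AntiSpoof equivalence tables are far larger.

```python
import re

# Toy confusables map; the real AntiSpoof equivalence tables cover
# thousands of visually similar Unicode characters.
CONFUSABLES = str.maketrans({
    "0": "o", "1": "l", "3": "e", "@": "a", "$": "s",
    "\u0430": "a",  # Cyrillic а folded to Latin a
})

# Hypothetical blocked patterns, for illustration only.
BLOCKED_PATTERNS = [re.compile(r"\bbadword\b")]

def normalize(text: str) -> str:
    """Lower-case and fold lookalike characters to a canonical form."""
    return text.lower().translate(CONFUSABLES)

def violates_filter(added_text: str) -> bool:
    """Return True if the normalized text matches any blocked pattern."""
    canonical = normalize(added_text)
    return any(p.search(canonical) for p in BLOCKED_PATTERNS)

print(violates_filter("B@DW0RD"))  # True, despite the character spoofing
```

Normalizing before matching is what keeps simple character substitutions like "B@DW0RD" from slipping past a filter that only knows the plain spelling.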

Outcome 2: Reporting

  • Objective 1: Build consensus in the English Wikipedia community for requirements and direction of the reporting system.
    • Project documentation: Community health initiative/Reporting system
    • Status: The initial prep work is starting now; this will be a significant focus for our team in 2018. We aim to begin development in July, but hope to reach consensus on a direction by April and to settle the requirements and design by June.
    • Next steps: Keep researching and discussing. We are analyzing the results of a survey about English Wikipedia's current reporting system, run in cooperation with SuSa.
  • Objective 2: Build a prototype of the new reporting system to validate our concepts.

Outcome 3: Evaluation of harassment cases

  • Objective 1: Build a user interaction evaluation tool to streamline the workflow for admins who investigate the interaction history of users who report harassment.
  • Objective 2: Build consensus in the English Wikipedia community for requirements and direction of a private admin-only database for discussing and documenting harassment incidents.

Outcome 4: Known bad actors are effectively prevented from causing further harm

  • Objective 1: Build a per-page blocking tool, which will help wiki administrators redirect disruptive users without completely blocking them from contributing to the projects.
    • Project documentation: Community health initiative/Per user page block, Community health initiative/Editing restrictions, Community health initiative/Blocking tools and improvements
    • Status: This feature received strong support in the 2016 and 2017 Community Wishlist surveys, but when we attempted to discuss it within the context of "productizing editing restrictions," support was lacking. We are currently discussing this in our cross-wiki blocking consultation under the context of "Full-site blocks are not always the appropriate response to some situations."
    • Next Steps: We've committed to begin development on a more granular type of blocking for Q3, which may be per-page blocking or something similar. We are waiting for the blocking consultation to conclude before we decide. (A conceptual sketch of per-page blocking follows this list.)
  • Objective 2: Make CheckUser tools work across projects so admins can check contributions from IPs on all Wikimedia projects in one query.
    • Project documentation: None yet
    • Status: Nothing has been done on this objective.
    • Next Steps: We may look into this as part of the blocking consultation, but more likely than not we will punt this to next FY.
  • Objective 3: Build consensus in the English Wikipedia community for requirements and direction of a sockpuppet blocking tool.
    • Project documentation: Community health initiative/Blocking tools and improvements
    • Status: This is also part of the blocking consultation, under the contexts of "Username or IP address blocks are easy to evade by sophisticated users" and "Aggressive blocks can accidentally prevent innocent good-faith bystanders from editing."
    • Next Steps: The leading idea so far is to give CheckUsers more information (potentially user-agent data). We anticipate building this in Q4.
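As a rough illustration of the "more granular blocking" idea discussed above, the sketch below models per-page blocks as a simple lookup consulted before an edit is allowed. The data structure and function names are assumptions for illustration only, not MediaWiki's actual design.

```python
from collections import defaultdict

# Hypothetical store: user name -> set of page titles the user may not
# edit. MediaWiki's eventual schema will almost certainly differ.
per_page_blocks = defaultdict(set)

def block_user_on_page(user: str, page: str) -> None:
    """Block `user` from editing `page` and nothing else."""
    per_page_blocks[user].add(page)

def can_edit(user: str, page: str) -> bool:
    """A per-page-blocked user keeps editing everywhere else."""
    return page not in per_page_blocks[user]

block_user_on_page("ExampleUser", "Talk:Contentious article")
print(can_edit("ExampleUser", "Talk:Contentious article"))  # False
print(can_edit("ExampleUser", "Unrelated article"))         # True
```

The appeal of this shape is that the remedy matches the disruption: a user who is a problem on one page keeps contributing everywhere else.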

Milestones/Targets

  • Milestone 1: Increase the confidence of admins with their ability to make accurate decisions in conduct disputes, measured via focus groups or consultations.
    • In September 2017 we ran an Administrator Confidence Survey, asking several questions about tools and preparedness to make administrative decisions. Based on answers to the question “Wikipedia has provided me enough resources to solve, mitigate, or intervene in cases of harassment,” our baseline is 35.9%. Full data here.
  • Milestone 2: Release three features which empower Wikimedia administrators to better enforce community policies and reduce abusive activity.
    1. Interaction Timeline
    2. Granular blocking tool
    3. Either an improved CheckUser block tool or the Wikihounding tool. We'll decide during Q4 planning.

Q2 objective status - December 2017

Summary

We were a bit ambitious, but we're mostly on track for all our objectives. The Interaction Timeline is on track for a beta launch in January, the worldwide Blocking consultation has begun, and we've just wrapped up work on stronger email preferences.

We decided to stop development on AbuseFilter but are ready to enable ProcseeBot on Meta wiki if the global community desires it. We've also made strides in how we communicate on-wiki, which is vital to all our successes.

Objective 1: Increase the confidence of our admins for resolving disputes

  • Key Result 1.1: Allow wiki administrators to understand the sequence of interactions between two users so they can make an informed decision by building and releasing the Interaction Timeline. (A minimal sketch of the underlying idea follows this list.)
  • Key Result 1.2: Consult with Wikimedians about shortcomings in MediaWiki’s current blocking functionality in order to determine which blocking tools (including sockpuppet, per-page, and edit throttling) our team should build in the coming quarters.
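For a sense of what the Interaction Timeline assembles, here is a minimal sketch that pulls two users' recent contributions from the public MediaWiki API (list=usercontribs) and interleaves them chronologically. The account names are placeholders, and the real Timeline does far more: pagination, filtering to pages both users edited, and a visual interface.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def fetch_contribs(user, limit=50):
    """Fetch a user's recent edits via the list=usercontribs module."""
    params = {
        "action": "query",
        "format": "json",
        "list": "usercontribs",
        "ucuser": user,
        "uclimit": limit,
        "ucprop": "title|timestamp|comment",
    }
    resp = requests.get(API, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["usercontribs"]

def timeline(user_a, user_b):
    """Interleave both users' edits into one chronological sequence."""
    edits = fetch_contribs(user_a) + fetch_contribs(user_b)
    # MediaWiki timestamps are ISO 8601, so a string sort is chronological.
    return sorted(edits, key=lambda e: e["timestamp"])

# Placeholder account names; substitute the two users under review.
for edit in timeline("ExampleUserA", "ExampleUserB"):
    print(edit["timestamp"], edit["user"], edit["title"])
```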

Objective 2: Reduce harassment by improving preventative solutions

  • Key Result 2.1: Allow users to control who can communicate with them by allowing them to prohibit direct emails from certain user groups.
  • Key Result 2.2: Prevent more types of blatant harassment on English Wikipedia by improving AbuseFilter performance and working with the community to enable additional anti-harassment edit filters.
    • Project documentation: Community health initiative/AbuseFilter
    • Status: We have decided not to invest further in AbuseFilter. The edit filter managers have access to our performance monitoring, and we have offered to surface the data in the UI if there is strong desire, but the conversation has not been lively.
  • Key Result 2.3: Decide if we need to make any changes to ProcseeBot, a tool that automatically blocks open proxies from editing Wikipedia, or if we want to port it to Meta to perform global proxy blocks, by working with the bot’s creator, User:Slakr.

Objective 3: Improve how our team communicates with Wikimedians

  • Key Result 3.1: Encourage active, constructive participation between Wikimedians and the Anti-Harassment Tools team through the entire product development cycle (pre- and post-release) by establishing communication guidelines and cadence.
    • Status: On track! We've learned quite a bit over the course of our Timeline and Blocking consultations, as well as two surveys. We've developed pretty good working techniques for setting a welcoming stage, inviting people to participate, and holding constructive conversations when they arrive. We'll share our notes soon. ❤️

Q3 goals - December 2017

Hello all! Now that the Interaction Timeline beta is out and we're working on the features to get it to a stable first version (see phab:T179607) our team has begun drafting our goals for the next three months, through the end of March 2018. Here's what we have so far:

  • Objective 1: Increase the confidence of our admins for resolving disputes
    • Key Result 1.1: Allow wiki administrators to understand the sequence of interactions between two users so they can make an informed decision by adding top-requested features to the Interaction Timeline.
    • Key Result 1.2: Allow admins to apply appropriate remedies in cases of harassment by beginning development on more granular types of blocking.
    • Key Result 1.3: Consult with Wikimedians about shortcomings in MediaWiki’s current blocking functionality in order to determine which improvements to existing blocks and which new types of blocking our team should implement in the first half of 2018.
  • Objective 2: Reports of harassment are higher quality while less burdensome on the reporter
    • Key Result 2.1: Begin research and community consultation on English Wikipedia for requirements and direction of the reporting system, for prototyping in Q4 and development in Q1 FY18-19.

Q1 summary/Q2 goals - October 2017

Happy October, everyone! I'd like to share a quick summary of what the Anti-Harassment Tools team accomplished over the past quarter (and our first full quarter as a team!) as well as what's currently on the docket through December. Our Q1 goals and Q2 goals are on wiki, for those who don't want emoji and/or commentary.

Q1 summary

📊 Our primary metric for measuring our impact for this year is "admin confidence in resolving disputes." This quarter we defined it, measured it, and are discussing it on wiki. 69.2% of English Wikipedia admins report that they can recognize harassment, while only 39.3% believe they have the skills and tools to intervene or stop harassment and only 35.9% agree that Wikipedia has provided them with enough resources. There's definitely room for improvement!

🗣 We helped SuSa prepare a qualitative research methodology for evaluating Administrator Noticeboards on Wikipedia.

⏱ We added performance measurements for AbuseFilter and fixed several bugs. This work is continuing into Q2.

⚖️ We've begun on-wiki discussions about Interaction Timeline wireframes. This tool should make user conduct investigations faster and more accurate.

🤚 We've begun an on-wiki discussion about productizing per-page blocks and other ways to enforce editing restrictions. We're looking to build appropriate tools that keep rude yet productive users productive (but no longer rude).

🤐 For Muting features, we've finished & released Notifications Mute to all wikis and Direct Email Mute to Meta Wiki, with plans to release to all wikis by the end of October.

Q2 goals

⚖️ Our primary project for the rest of the calendar year will be the Interaction Timeline feature. We plan to have a first version released before January.

🤚 Let's give them something to talk about: blocking! We are going to consult with Wikimedians about the shortcomings in MediaWiki’s current blocking functionality in order to determine which blocking tools (including sockpuppet, per-page, and edit throttling) our team should build in the coming quarters.

🤐 We'll decide on, build, and release the ability for users to restrict which user groups can send them direct emails.

📊 Now that we know the actual performance impact of AbuseFilter, we are going to discuss raising the filter ceiling.

🤖 We're going to evaluate ProcseeBot, the cleverly named tool that blocks open proxies.

💬 Led by our Community Advocate Sydney Poore, we want to establish communication guidelines and cadence which encourage active, constructive participation between Wikimedians and the Anti-Harassment Tools team through the entire product development cycle (pre- and post-release).

Q1 goals - July 2017

I have two updates to share about the WMF’s Anti-Harassment Tools team. The first (and certainly the most exciting!) is that our team is fully staffed at five people. Our developers, David and Dayllan, joined over the past month. You can read about our backgrounds here.

We’re all excited to start building some software to help you better facilitate dispute resolution. Our second update is that we have set our quarterly goals for the months of July-September 2017 at mw:Wikimedia Audiences/2017-18 Q1 Goals#Community Tech. Highlights include: