Updates

Auditing Report Tools

The Wikimedia Foundation’s Anti-Harassment Tools Team is starting research on how harassment reports are made across the internet, with a particular focus on Wikimedia projects. We are planning to do 4 major audits.

Our first audit focuses on reporting on English Wikipedia. We found 12 different ways editors can report, which we divided into two groups: on-wiki and off-wiki reporting. On-wiki reporting tends to be extremely public, while off-wiki reporting is more private. We’ve decided to focus on 4(ish) spaces for reporting, broken into two buckets: ‘official ways of reporting’ and ‘unofficial ways of reporting.’

Official Ways of Reporting (all are maintained by groups of volunteers, some more ad hoc than others, e.g. AN/I)

  • Noticeboards: 3RR, AN/I, AN
  • OTRS
  • ArbCom email listserv
    • We’ve already started user interviews with ArbCom

Unofficial Ways of Reporting:

  • Highly followed talk pages (such as Jimmy Wales’s)

Audit 2 focuses on Wikimedia projects such as Wikidata, Meta, and Wikimedia Commons. Audit 3 will focus on other open-source organizations and projects, like Creative Commons and GitHub. Audit 4 will focus on social media companies and their reporting tools, such as Twitter, Facebook, etc. We will focus on how these companies interact with English-speaking communities and on their policies for those communities, specifically because policies differ from country to country.

Auditing Step by Step Plan:

  1. Initial audit
  2. Write up findings and present them to the community
    • This will include design artifacts like user journeys
  3. On-wiki discussion
  4. Synthesize discussion
    • Takeaways, bullet points, and feedback will then be posted to the wiki for on-wiki discussion
  5. Move forward to next audit
    • Parameters for the next audit will come from the community, along with technical/product parameters

We are looking for feedback from the community on this plan. We expect to gain a deeper understanding of the current workflows on Wikimedia sites so we can begin identifying bottlenecks and other potential areas for improvement. We are focusing on what works for Wikimedians while also learning what other standards and ways of reporting exist in the wider world.

--CSinders (WMF) (talk) 17:16, 2 March 2018 (UTC)

February 20, 2018

Today we had a meeting to discuss research related to the harassment reporting tool; we specifically covered auditing current harassment and conflict workflows. Below are notes we took during the meeting discussing different ways to approach this research. We are looking for feedback!

  • How will we create a taxonomy for comparing these? (a rough data-model sketch follows these notes)
    • ‘Official’ vs. ‘unofficial’ reporting methods
      • ‘Official’ is something like AN/I, which has a specific process and is considered part of policy or institutionalized reporting for EN:Wikipedia
      • ‘Unofficial’ is something like posting on a talk page; it’s a great place to start conflict mitigation, but it’s not considered reporting.
    • On-wiki vs. off-wiki
      • Noticeboards vs. emailing
    • Harassment vs. conflict
  • Level of detail for each workflow
    • Small handful (max 5) of detailed write-ups
    • Bullet list of other places where user misconduct is reported (maybe with some explanation)
  • What artifacts do we want to generate? (drawings, text, screenshots)
    • 5 user journeys — step-by-step narratives with illustrations/screenshots
    • On-wiki
    • All other less important/less official/less used workflows will just be simple bullet lists
  • What do we need?
    • Product manager: Documentation of existing workflows, to have a standard (yet living) artifact that a conversation can revolve around
    • Caroline Sinders, Design researcher: What is the scope of conflict mitigation? General discovery to highlight characteristics of these products. Gain an understanding of what does/doesn’t work. Where do things get lost?
    • Community advocate: Overview of current processes that shows deep knowledge of the community’s current methods. Wiki processes and policy.
  • Conflict vs. harassment — How many non-harassment (yet still conflict-related) workflows do we want to audit?
    • Commonly used terms instead of harassment (misconduct, conflict, etc.)
    • We’ll need to be open to a larger discussion about user conflicts, specifically ways to differentiate between ‘conflict’, ‘harassment’, and ‘abuse’. We take all of these seriously, but they are different kinds of actions, and some will warrant different kinds of responses.
  • Small-scale vs. large-scale harassment — What are we trying to solve?
    • Trevor Bolliger: Small scale. “When a user wants to inform another user that they have been harassed, how do they do this?”
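As a purely illustrative aid (referenced in the first bullet above), here is a minimal sketch of how the taxonomy’s classification axes could be modeled as a data structure. All names, fields, and example entries are hypothetical assumptions for discussion, not a schema the team has settled on.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical classification axes drawn from the taxonomy discussion above.
class Officiality(Enum):
    OFFICIAL = "official"      # e.g. AN/I: specific process, part of policy
    UNOFFICIAL = "unofficial"  # e.g. a talk-page post: mitigation, not reporting

class Venue(Enum):
    ON_WIKI = "on-wiki"        # e.g. noticeboards
    OFF_WIKI = "off-wiki"      # e.g. email

class IssueType(Enum):
    CONFLICT = "conflict"
    HARASSMENT = "harassment"
    ABUSE = "abuse"

@dataclass
class ReportingWorkflow:
    """One of the ~12 reporting methods found in the first audit."""
    name: str
    officiality: Officiality
    venue: Venue
    handles: list[IssueType]
    detailed_writeup: bool  # only a small handful (max 5) get full user journeys

# Illustrative entries only; the real classification comes from the audit itself.
workflows = [
    ReportingWorkflow("AN/I", Officiality.OFFICIAL, Venue.ON_WIKI,
                      [IssueType.HARASSMENT, IssueType.CONFLICT], True),
    ReportingWorkflow("ArbCom email listserv", Officiality.OFFICIAL, Venue.OFF_WIKI,
                      [IssueType.HARASSMENT, IssueType.ABUSE], True),
    ReportingWorkflow("Highly followed talk page", Officiality.UNOFFICIAL,
                      Venue.ON_WIKI, [IssueType.CONFLICT], False),
]
```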

February 8, 2018

Collating past research and building out more robust and specific workflows from the above "Current workflows for reporting harassment." This serves as an exploration and a first 'reporting' audit, which is like a product audit. It entails listing out all of the steps in reporting harassment; for example, posting on AN/I would look something like "there's a problem between editors, one editor writes to another editor's talk page, the problem persists for some time, that same editor posts to the other editor's talk page about bringing this problem to AN/I, ..." and so on and so forth. Some of this 'audit' may be word-heavy, or it may feature diagrams or workflows. However, it's designed to help illustrate how many steps exist in each method of reporting, and to highlight which aspects of those reporting workflows rely on policy, rules, human infrastructure (i.e., people talking to each other), technology (such as an email listserv or a talk page), etc. All of these findings will be shared with the community and open to feedback and thoughts.