User reporting system consultation 2019
The Wikimedia Foundation's Community health initiative plans to design and build a new user reporting system to make it easier for people experiencing harassment and other forms of abuse to provide accurate information to the appropriate channel so that action can be taken.
A multi-phase user reporting system consultation is happening from February through June 2019. The consultation will gather input from as many different parts of the Wikimedia community as possible—from different types of Wikimedia projects, from people with a variety of different roles, and from people living in different cultures and geographic locations.
It is particularly important to obtain multiple perspectives from people who have been targets of harassment, people who have been accused, and the people in the Wikimedia movement who currently handle reports of harassment and abuse.
Input will be gathered through a variety of methods, including on-wiki comments, private email, user interviews, and live focus groups.
The high-level goal of this project is to build a new harassment reporting system that produces higher-quality reports that can be successfully processed and does not further alienate victims of harassment.
Expected outcome of the User reporting system consultation 2019
By the end of this consultation, the Community health initiative will have an overall direction for a single feature or group of features that a software development team will be able to build starting in July 2019. We'll have a rough agreement with content contributors and movement organizers across multiple languages of Wikimedia projects about the direction the project will take.
The result will not be a complete, detailed product specification or the introduction of a fully formed new policy, processes or workflows. Detailed plans will be developed over time with continued consultation with the community of users.
By the last phase of the consultation, we'll be able to answer these questions:
- What are the essential components of a user reporting system? Which of these are missing from the current methods of handling cases of harassment?
- Will we build one feature, or more than one?
- Will some aspects of the user reporting system that we build be public and other aspects be private?
- What are the important open questions that the product team should investigate and test prior to starting development?
Our process has four overlapping parts:
- Research: Wikimedia Foundation researcher Claudia Lo will lead research into current reporting workflows on Wikimedia projects and conduct a comparative analysis of user-to-volunteer reporting systems on online platforms, including Wikipedia. Additional research topics will emerge from on-wiki discussions.
- Discuss: Research will help inform and complement the global, community-wide consultation about harassment reporting.
- Design: When there is a solid understanding of what tool(s) or system(s) are needed, we will create several designs or prototypes to illustrate and test the leading ideas. We'll share these publicly for feedback to determine the best possible solutions.
- Build or assemble: When we're confident that we've identified the best possible ideas to allow users to report other users for abusive behavior, our software development team will build the new tools. The Community health initiative will work with the community to advance any social or cultural changes.
| Phase | Timeframe | Activities |
|---|---|---|
| Phase 0 - Planning & research | July 2018 - February 2019 | Gathering preexisting consultations, doing preliminary research, coordinating across Wikimedia Foundation departments. |
| Phase 1 - Community consultation: set framework & call to participate | February 15 - March 15, 2019 | Getting the word out, forming focus groups, identifying volunteer liaisons for wikis and affiliates, offering existing research for review, explaining the scope of the project and decision making, soliciting feedback about the process. |
| Phase 2 - Community consultation: options | April 1 - April 30, 2019 | Gathering and presenting the existing options to the community, collecting new ideas, soliciting feedback. Creating a list of potential options for more in-depth legal & technical analysis and focus group review. |
| Phase 2.5 - Analysis and winnowing of options | May 1 - June 30, 2019 | Focus groups, user interviews, business case, prioritization. |
| Phase 3 - Community consultation: summary of options and prioritization | July 1, 2019 | Presenting a direction forward, with the product(s) prioritized for development by the Anti-Harassment Tools team and the next steps for other non-technical solutions. |
| Phase 4 - Transition to software development and reorganization of community policy or processes | July 2019 - ? | Transitioning to software development on Phabricator and on Meta Community health initiative project pages. Further exploring the role of volunteers working within the system and how reports can escalate to different levels. |
Give feedback about the consultation, design and development process
Please read this page and provide feedback on the overall process for this User reporting system consultation. Tell us what's missing or unclear. Suggest more ways to include more parts of the on-wiki community and the larger Wikimedia movement.
To make a good decision about a new user reporting system it is essential to listen and learn from you and the rest of the Wikimedia movement. We are genuinely seeking to understand and take into account the feedback we receive. However, the final decision about the type of software to support and Trust and Safety approaches will be made by the Wikimedia Foundation, after consideration of all the available information and the Wikimedia movement's 2030 strategic direction.
Local wiki self-governance
Wikimedia Foundation Trust & Safety
There are some rare instances when Wikimedia Foundation Trust and Safety will take action to protect the safety of the community and the public, where action at the local community governance level is either insufficient or not possible. In such rare cases the Wikimedia Foundation may override local policy, for example to protect the safety of the Wikimedia communities or the public. See Trust & Safety Office actions for more details.
Approach to decision making
There may be difficult decisions to make about which software features to prioritize and how to allocate resources toward other aspects of a user reporting system. When it comes time to make a decision, all valid options will be weighed by the following criteria:
- Which option(s) most aligns with Wikimedia movement values?
- Which option(s) is most in alignment with the Strategic Direction of Knowledge Equity?
- Which option(s) most aligns with the goal "to build a new harassment reporting system that produces higher quality reports that can be successfully processed and does not further alienate victims of harassment"?
- Which option will result in a more accessible user experience, for anyone on any device?
- Which option will result in a more sustainable product that will be resilient to changing technologies, evolving use cases, and user expectations?
- Which option(s) do not introduce undue risk for achieving our project goals?
Help us spread the word
The success of this project depends on collecting ideas and feedback from people with a variety of different roles in the Wikimedia movement.
- Translate this consultation page.
- Sign up to volunteer to share information about this consultation on your wiki or to your Wikimedia affiliate organization.
Join a focus group
In order to collect ideas and feedback from people with a variety of roles and experiences in the Wikimedia movement, one or more focus groups will be formed.
Learn more about research on user reporting systems
The Community health initiative, led by Wikimedia Foundation researcher Claudia Lo, is conducting research projects related to user reporting systems.
The findings generated from this research will guide this and ongoing community consultations, inform product decisions, and provide wiki communities with a more thorough understanding of how reporting systems operate across Wikimedia communities and other online platforms.
Enwiki Reporting system summary
This project provides a summary of existing reporting systems, both formal and informal, on English Wikipedia. It is available as a summary report, with an accompanying diagram of formal reporting spaces on English Wikipedia. It also defines some key terms for future conversations on this topic, such as involved users, the difference between formal and informal systems, and language for discussing broader concerns of community values and issues of labour. We find that key issues with the current reporting system include its difficulty of access, eroded trust in current systems, the unstructured nature of noticeboards, and the clash between a desire for transparency and concerns over safety and privacy. We also find that informal systems, relying on existing relationships between editors, serve as an ad hoc private reporting system, as well as a complementary structure to formal systems that can guide reporters through the often-tricky process of following the official reporting procedures.
User-to-volunteer reporting system rubric
This project took place over February 2019, and produced both a summary report, and an accompanying detailed report.
On Wikipedia, most content and conduct disputes are handled by groups of volunteers. Accordingly, reports of such disputes are first routed to them, and only in cases of immediate danger or outsized harm do reports bypass this volunteer system and go directly to the Foundation's Trust & Safety team. At this stage in our ongoing project on creating a reporting system for Wikimedia projects, we could learn from investigating peer platforms' implementations of private reporting systems. Not only could this comparison provide interesting design examples for us; new editors' expectations for reporting systems are also being shaped by the reporting systems they encounter elsewhere online.
Though many online platforms incorporate some form of reporting system, these typically channel user reports directly to an in-house or contracted team of employees. This makes them most analogous to the use of the emergency wikimedia.org reporting channel. However, it also makes them different from the types of reporting systems to which our community is accustomed, and thus the types of reporting systems we will be expected to use as a basis for design.
By conducting a review of existing best-practices documents and research on this subject, we can create an assessment rubric for evaluating private peer-to-volunteer reporting systems. Some of the most prominent platforms using such a system include Reddit and Facebook Groups. We can run these platforms through this rubric, and additionally compare the current state of Wikipedia's reporting systems, for a comparative understanding of these mechanisms.
The Wikimedia Foundation's Anti-Harassment Tools team wants to better understand these existing systems to identify any pain points or shortcomings that we can address with improved software. Our research will focus heavily on English Wikipedia as the largest wiki community, but we aim to build a tool that can be used by any wiki of any size. Research completed to date includes:
- The 2017 Community Insights Survey results, in which 84% of 300 users requested better reporting tools and 77% requested better noticeboards.
- Overview of research about English Wikipedia dispute resolution and harassment — Summary of research performed by the Wikimedia Foundation's Anti-Harassment Tools and Trust and Safety teams.
- Administrators' Noticeboard/Incidents Experience Survey — Survey about user experience with Administrators' Noticeboard/Incidents
- English Wikipedia's Administrators’ Noticeboard/Incidents quantitative research — A quantitative data analysis of posts to AN/I.
- Harvard Negotiation and Mediation Clinical Program report — Recommendations on the Development of Anti-Harassment Tools and Behavioural Dispute Resolution Systems for Wikimedia
- Notes from Wikimania 2018 roundtable — Anonymized raw notes from the "Building a Better Harassment Reporting System" roundtable.
- IdeaLab submissions — A summary of the IdeaLab grant submissions related to reporting harassment on Wikimedia
- Enwiki Reporting system summary — A summary report of existing reporting systems on English Wikipedia and a definition of some useful terms on the topic. Comes with an accompanying reporting workflow diagram.