Wikimedia Foundation Annual Plan/2017-2018/Final/Community Health
Community Health
Teams: Community Tech, Support and Safety, Research
Program leads: Trevor Bolliger (Community Tech), Patrick Earley (Support and Safety), Dario Taraborelli (Research)
- Strategic priorities
This program directly answers Community approach #1 in the Strategic Priorities: “Reduce harassment and the gender gap to facilitate a safe, welcoming, and supportive environment for contributors and editors.”
Timeframe: 28 months
More information: Community health initiative
| Expense | FY17-18 Plan | Description |
|---|---|---|
| Community Health | | Description of Community Health Expenses |
| Staffing Expenses | 849 | 6.27 FTE in the Product and Community Engagement departments |
| Non-Staffing Expenses | - | |
| Data Center Expenses | - | |
| Grants | - | |
| Endowment Contribution | - | |
| Donation Processing Fees | - | |
| Outside Contract Services | - | |
| Legal Fees | 150 | Outside legal fees related to bringing anti-harassment legal claims and taking action against disruptions of the projects |
| Travel & Conferences | 22 | Travel to community events and community conferences |
| Other Expenses (Wikidata, Personal Property Taxes) | 7 | Additional payroll fees and personnel-related expenses not captured in "Staffing Expenses" |
| Total Program Expenses | 1,028 | |
Summary
While our projects’ policies, procedures, tools, and community norms have evolved over time, improvement is still needed to address the problem of harassment and intimidation of contributors. The Foundation’s 2015 Harassment Survey found that a significant number of editors who either experienced or directly witnessed harassment on our sites were discouraged from contributing by the experience, stunting the growth of our projects. There is general agreement that both the processes and the tools used to combat this problem could benefit significantly from improvement.
The Community Health Initiative brings the Support and Safety and Community Tech teams together to address this multifaceted problem. The program has two interconnected components. The Support and Safety team will work with the community to review and improve policies and procedures, and the Community Tech team will research and build new Anti-Harassment Tools to help volunteers better put those policies into practice. (Note: this initiative is a special project funded by Newmark and does not include all the work that the Support and Safety team is doing to address community harassment.)
Goal
The Community Health Initiative aims to reduce the amount of harassing behavior occurring on Wikimedia projects and increase the resolution rate and accuracy of conduct dispute cases that do arise.
- The Community Tech team’s Anti-Harassment Tools project has four focus areas: Detection/Prevention, Reporting, Evaluation of harassment cases, and Blocking.
- The Support and Safety team’s Policies and Enforcement Growth project will examine strengths and weaknesses in existing community behavioral policies and enforcement processes, then translate these findings into structural improvements and pilots of new approaches.
- The Research team's Anti-Harassment Research project will aim to understand and model the characteristics of harassment in Wikimedia projects in order to inform the development of anti-harassment tools and recommendations for community-specific behavioral policies and enforcement processes.
Segment 1: Anti-harassment tools
Lead team: Community Tech
Outcomes and objectives
Outcome 1: Detection/Prevention
Blatant harassment is more easily identified and prevented.
To achieve this, we will extend existing tools used by the largest communities to expand their capabilities and improve their accuracy. We will also research potential features that identify harassment incidents before they escalate.
As a result of this work, community leaders will be better equipped to prevent potential instances of harassment.
- Objective 1: Implement performance, usability, and anti-spoof improvements to AbuseFilter so admins can craft filters to block words and strings that violate community policies.
- Objective 2: Implement stability improvements to ProcseeBot, an automated tool for blocking open proxies, so admins can more effectively manage troublesome proxies.
- Objective 3: Based on social interaction modeling research, build a prototype to surface potential cases of wikihounding to admins.
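The wikihounding prototype in Objective 3 would need some way to score how closely one editor's activity trails another's. The planned design is not specified here, so the following is only an illustrative sketch: it flags cases where one editor repeatedly shows up on pages shortly after another, using made-up data shapes and an arbitrary time window.

```python
def wikihounding_score(follower_edits, target_edits, window=3600):
    """Return the fraction of the target's edited pages on which the
    follower edited shortly *after* the target first did.

    Each edit list holds (page_title, unix_timestamp) tuples.
    A high score only suggests a pattern worth human review --
    it is never, by itself, proof of misconduct.
    """
    # Remember the earliest time the target touched each page.
    target_pages = {}
    for page, ts in target_edits:
        target_pages[page] = min(ts, target_pages.get(page, float("inf")))
    if not target_pages:
        return 0.0
    followed = set()
    for page, ts in follower_edits:
        first = target_pages.get(page)
        if first is not None and 0 < ts - first <= window:
            followed.add(page)
    return len(followed) / len(target_pages)

# Toy example: the follower appears on 2 of the target's 3 pages.
target = [("Apple", 100), ("Pear", 200), ("Plum", 300)]
follower = [("Apple", 150), ("Pear", 250), ("Cherry", 400)]
print(round(wikihounding_score(follower, target), 2))  # 0.67
```

Any real implementation would also have to account for legitimate overlap, such as recent-changes patrollers who follow many editors for good reasons.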
Outcome 2: Reporting
Incidents of harassment that require assistance are accurately reported.
To achieve this, we will partner with the enwiki community to design and prototype a more accessible, less chaotic, and less stressful system that helps targets of harassment reach out for assistance.
As a result of this work, contributors will feel more comfortable reporting harassment incidents, which will help them to get the support that they need.
- Objective 1: Build consensus in the English Wikipedia community for requirements and direction of the reporting system.
- Objective 2: Build a prototype of the new reporting system to validate our concepts.
Outcome 3: Evaluation of harassment cases
Admins can make confident and correct decisions in conduct disputes, with a less burdensome time commitment.
To achieve this, we will research and build features that facilitate the workflows of administrators who research and respond to reports of harassment, potentially creating a private space to document and discuss.
As a result, more administrators should feel empowered to engage with conduct dispute resolution and feel confident about their mediation decisions.
- Objective 1: Build a user interaction evaluation tool to streamline the workflow for admins who investigate the interaction history of users who report harassment.
- Objective 2: Build consensus in the English Wikipedia community for requirements and direction of a private admin-only database for discussing and documenting harassment incidents.
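A core piece of the interaction evaluation tool described in Objective 1 is merging two users' edit histories into one chronological view restricted to the pages where they actually interacted. The field names and data shape below are assumptions for illustration, not the tool's actual design:

```python
from datetime import datetime

def interaction_timeline(edits_a, edits_b):
    """Merge two users' edits into one chronological view, keeping only
    pages both users touched -- the pages where interaction occurred.

    Each edit is a dict with "user", "page", "timestamp" (ISO 8601),
    and "comment" keys (hypothetical fields for this sketch).
    """
    shared = {e["page"] for e in edits_a} & {e["page"] for e in edits_b}
    merged = [e for e in edits_a + edits_b if e["page"] in shared]
    merged.sort(key=lambda e: datetime.fromisoformat(e["timestamp"]))
    return merged

edits_a = [
    {"user": "Alice", "page": "Talk:Tea", "timestamp": "2017-07-01T10:00:00", "comment": "question"},
    {"user": "Alice", "page": "Coffee", "timestamp": "2017-07-02T09:00:00", "comment": "copyedit"},
]
edits_b = [
    {"user": "Bob", "page": "Talk:Tea", "timestamp": "2017-07-01T11:30:00", "comment": "reply"},
]
for e in interaction_timeline(edits_a, edits_b):
    print(e["timestamp"], e["user"], e["page"])
```

An admin reading this merged view can see at a glance who responded to whom, and when, without manually cross-referencing two contribution logs.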
Outcome 4: Blocking
Known bad actors are effectively prevented from causing further harm.
To achieve this, we will both improve the effectiveness of existing blocking tools and introduce new features that give admins additional options for responding to bad actors.
As a result, repeat harassment is curtailed at the lowest possible cost to admin workload and to legitimate wiki contributions.
- Objective 1: Build a per-page blocking tool, which will help wiki administrators redirect users who are being disruptive without completely blocking them from contributing to the projects.
- Objective 2: Make CheckUser tools work globally, so admins can check a user's contributions across IPs on all Wikimedia projects in one query.
- Objective 3: Build consensus in the English Wikipedia community for requirements and direction of a sockpuppet blocking tool.
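The per-page block in Objective 1 differs from a traditional block in that the decision depends on which page is being edited. A minimal sketch of that permission check, with invented data structures (the real tool's design is still to be determined):

```python
def edit_allowed(user, page, site_blocks, page_blocks):
    """Decide whether an edit may be saved, given traditional full-site
    blocks and the narrower per-page blocks described above.

    site_blocks: set of users blocked from the whole site.
    page_blocks: dict mapping user -> set of pages they may not edit.
    (Both structures are hypothetical, for illustration only.)
    """
    if user in site_blocks:
        return False  # full-site block always wins
    return page not in page_blocks.get(user, set())

site_blocks = {"Vandal1"}
page_blocks = {"Hothead": {"Talk:Kittens"}}
print(edit_allowed("Hothead", "Talk:Kittens", site_blocks, page_blocks))  # False
print(edit_allowed("Hothead", "Puppies", site_blocks, page_blocks))       # True
```

The design choice here is the point of the objective: "Hothead" stays out of the dispute on one page but can keep contributing productively everywhere else.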
Milestones/Targets
- Milestone 1: Increase the confidence of admins with their ability to make accurate decisions in conduct disputes, measured via focus groups or consultations.
- Milestone 2: Release three features which empower Wikimedia administrators to better enforce community policies and reduce abusive activity.
Segment 2: Policy and Enforcement Growth
Lead team: Support and Safety
Timeframe: 27 months. 18 months for first phase (Objective 1: 12 months starting 17-18 Q1, Objective 2: 9 months starting 17-18 Q3). Phase 2 pilots will launch beginning Q3 18-19, with community consultation to select approach by the beginning of Q2 19-20.
Outcomes
editOutcome 1: Support and Safety will fulfill ongoing community requests for information about how harassment is handled on our projects.
This is in addition to serving as liaison for the work of Segment 1 above. Beginning with English Wikipedia, a large community that offers a wealth of data, we will provide contributors with research and analysis of how behavioural issues on English Wikipedia are a) covered in policy, and b) enforced in the community, particularly on the noticeboards where problems are discussed and actioned. We will also research alternate ways of addressing specific issues, assessing their effectiveness and identifying approaches that have found success on other Wikimedia projects. This will help our communities make informed changes to existing practices.
Outcome 2: In phase 2, we will lead pilot trials of alternate approaches identified through Outcome 1
After analyzing the results of trials and presenting possible paths forward to volunteer communities, we will help communities move forward with desired improvements. The goal is not only to present possible approaches, but to help English Wikipedia choose one, through community consensus, that will help community members effectively reach out for assistance with problems and effectively aid those who do reach out - at the same time that we are developing and deploying more effective tools for enforcing policies.
Objectives
- Objective 1: Using an approach similar to Peter Coombe’s research on help pages, survey the participants at the Administrators’ Noticeboard/Incidents on their experience – namely, the perceived effectiveness of the process and the quality of outcomes. A second, quantitative approach will classify the types of reports historically posted there, to gain an understanding of how the board is being used.
- Objective 2: Conduct a broad overview research project on trends, strengths, and challenges in forming effective behavioural policies on three major projects.
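The quantitative half of Objective 1 amounts to sorting historical noticeboard reports into categories and counting them. A real study would rely on a hand-built codebook and human coders; the keyword buckets below are purely hypothetical, showing only how a first automated sorting pass might be shaped:

```python
import re
from collections import Counter

# Hypothetical keyword buckets -- NOT a real coding scheme.
CATEGORIES = {
    "harassment": r"harass|attack|insult|threat",
    "edit_warring": r"revert|edit.?war|3rr",
    "vandalism": r"vandal",
}

def classify_reports(reports):
    """Tally reports by the first-pass categories they match."""
    counts = Counter()
    for text in reports:
        matched = False
        for label, pattern in CATEGORIES.items():
            if re.search(pattern, text, re.IGNORECASE):
                counts[label] += 1
                matched = True
        if not matched:
            counts["other"] += 1  # left for human coders to review
    return counts

reports = [
    "User X keeps reverting my edits, clear edit war",
    "Personal attacks and threats on my talk page",
    "Obvious vandalism on the Main Page",
]
print(classify_reports(reports))
```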
Milestones/Targets
- Milestone 1: Survey and analysis in Objective 1 completed, published, and summarized by end of Q4.
- Milestone 2: Scoping of broad policy research project goals in Objective 2 completed by end of Q3; implementation by Q1 of FY 18-19, research finalized and results released on Meta and workshopped at community conference by end of FY 18-19 Q2.
Segment 3: Research on harassment
Lead team: Research
Strategic priorities: Communities
Timeframe: There may be research required during FY18-19 and beyond, but we are including only 12 months in this current annual plan for FY17-18.
Summary
Over the past year, the Research team – in collaboration with other departments at the Wikimedia Foundation and external research collaborators – has developed a set of initiatives aiming to understand the prevalence of harassment and personal attacks; design algorithms to facilitate their detection; and understand the impact of toxic behavior on editor retention. We have also released large data sets and open source tools to support more open and reproducible research on harassment.
Goal
In the next fiscal year, we will continue research work on harassment and abuse detection and deliver empirical insights and new models to better characterize toxicity in Wikimedia discussion spaces.
Outcome
- Outcome 1: We aim to understand and model the characteristics of harassment in Wikimedia projects. This will primarily focus on quantitative research. We see great potential in using past contribution behavior to build models to identify additional forms of toxic behavior. These learnings will allow us to design better tools (Segment 1) and work with the community to evolve their policies (Segment 2).
Objectives
- Objective 1: Conduct research to characterize and model wikihounding.
- Objective 2: Prototype new models to facilitate sockpuppet detection and the classification of toxic discussions.
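At their core, the toxic-discussion classifiers in Objective 2 are supervised text models trained on labelled comments. The production research uses far richer features and much larger labelled corpora; this stdlib-only toy Naive Bayes (with invented training examples) shows only the shape of the approach:

```python
import math
from collections import Counter

class TinyNaiveBayes:
    """Toy bag-of-words Naive Bayes for toxic/civil comment labelling.
    Illustrative only -- real models use far more data and features."""

    def fit(self, texts, labels):
        self.word_counts = {label: Counter() for label in set(labels)}
        self.label_counts = Counter(labels)
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        best, best_lp = None, -math.inf
        total_docs = sum(self.label_counts.values())
        for label in self.label_counts:
            # Log prior plus Laplace-smoothed log likelihood per word.
            lp = math.log(self.label_counts[label] / total_docs)
            counts = self.word_counts[label]
            denom = sum(counts.values()) + len(self.vocab)
            for w in text.lower().split():
                lp += math.log((counts[w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

# Invented training data, purely for illustration.
train = ["you are an idiot", "thanks for the helpful fix",
         "shut up idiot", "great work on the article"]
labels = ["toxic", "civil", "toxic", "civil"]
model = TinyNaiveBayes().fit(train, labels)
print(model.predict("what an idiot"))  # toxic
```

Sockpuppet detection follows a similar supervised pattern, but over behavioral features (timing, pages edited, stylometry) rather than raw words.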
Segment 4: Community Health Legal Defense Fund
Teams: Community Engagement (Support and Safety, Community Tech, & Research), Legal
Strategic priorities: Communities
Time frame: 27 months; the legal segment will be ongoing once established.
Summary
We are building a more advanced system to reduce harassing behavior on Wikipedia and block harassers from the site. We will build new tools that more quickly identify potentially harassing behavior, and help volunteer wiki administrators evaluate harassment reports and make good decisions. Paired with existing tools that we’re improving and redesigning, this new system will streamline the way we combat “trolling,” “doxxing,” and other menacing conduct on Wikipedia.
Goal
By providing the volunteer community and Wikimedia Foundation staff with robust tools for detecting, reporting, and evaluating harassment, we will enable them to patrol Wikimedia communities more effectively and to identify and block harassing users. This in turn will create stronger, more effective, inclusive, and diverse communities, benefiting current and future editors alike. We won’t eliminate harassment entirely, but we can significantly reduce it with a more streamlined approach.
Outcome and objectives
- Outcome 1: As appropriate and to the extent possible, the Wikimedia Foundation will support users who face significant harassment, and work to keep harassers away from Wikimedia sites.
- Objective 1: Provide funding for community members to bring anti-harassment legal claims, as appropriate.
- Objective 2: Take direct legal action against users broadly disrupting the projects, as appropriate.