Community health initiative/User reporting system consultation 2019/ja

This page is a translated version of the page Community health initiative/User reporting system consultation 2019 and the translation is 35% complete.

User reporting system consultation 2019

Welcome!


The Wikimedia Foundation's Community health initiative is planning to design and build a new user reporting system. The goal is to ensure that people experiencing harassment and other forms of abuse can submit the appropriate information and have it reach the appropriate responders.

Rethinking how users report harassment

The multi-phase User reporting system consultation will run from February to June 2019. Through this consultation we aim to gather input from as many diverse groups across the Wikimedia communities as possible — across wiki projects, across the roles people hold, and across cultures and geographies.

We place particular value on the multiple perspectives of people who have been the target of harassment, people who have been accused of it, and people who currently handle reports of harassment and abuse within the Wikimedia movement.

Feedback will be collected in a variety of formats, including on-wiki comments, private email, and in-person interviews, and in collaboration with groups specializing in this area.

Goals

The high level goal of this project is to build a new harassment reporting system that produces higher quality reports that can be successfully processed and does not further alienate victims of harassment.

What to expect from the User reporting system consultation 2019

By the end of this consultation, the Community health initiative will have an overall direction for a single feature or group of features that a software development team will be able to build starting in July 2019. We'll have a rough agreement with content contributors and movement organizers across multiple languages of Wikimedia projects about the direction the project will take.

The result will not be a complete, detailed product specification, nor the introduction of fully formed new policies, processes, or workflows. Detailed plans will be developed over time through continued consultation with the community of users.

By the last phase of the consultation, we'll be able to answer these questions:

What are the essential components of a user reporting system? Which of these are missing from the current methods of handling cases of harassment?
Will we build one feature, or more than one?
Will some aspects of the user reporting system that we build be public and other aspects private?
What are the important open questions that the product team should investigate and test prior to starting development?

Process

Our process has four overlapping parts:

  1. Research: Wikimedia Foundation researcher Claudia Lo will lead research about current reporting workflows on Wikimedia projects and do a comparative analysis of user-to-volunteer reporting systems of online platforms, including Wikipedia. Additional research topics will emerge from on-wiki discussions.
  2. Discuss: Research will help inform and complement the global, community-wide consultation about harassment reporting.
  3. Design: When there is a solid understanding of what tools or systems are needed, we will create several designs or prototypes to illustrate and test the leading ideas. We'll share these publicly for feedback to determine the best possible solutions.
  4. Build or assemble: When we're confident that we've identified the best possible ideas to allow users to report other users for abusive behavior, our software development team will build the new tools. The Community health initiative will work with the community to advance any social or cultural changes.
Timeline for the discussion and design phases of the User reporting system

Phase 0 – Planning & Research (July 2018 – February 2019): Gathering preexisting consultations, doing preliminary research, and coordinating across Wikimedia Foundation departments.
Phase 1 – Community Consultation: Set framework & call to participate (February 15 – March 15): Getting the word out, forming focus groups, identifying volunteer liaisons for wikis and affiliates, offering existing research for review, explaining the scope of the project and decision making, and soliciting feedback about the process.
Phase 2 – Community Consultation: Options (April 1 – April 30): Gathering and presenting to the community the existing options, collecting new ideas, and soliciting feedback. Creating a list of potential options for more in-depth legal and technical analysis and focus group review.
Phase 2.5 – Analysis and winnowing of options (May 1 – June 30): Focus groups, user interviews, business case, and prioritization.
Phase 3 – Community Consultation: Summary of options and prioritization (July 1): Presenting a direction forward, with the product(s) prioritized for development by the Anti-Harassment Tools team and the next steps for other non-technical solutions.
Phase 4 – Transition to software development and reorganization of community policies or processes (July 2019 – ?): Transitioning to software development on Phabricator and the Meta Community health initiative project pages. Further exploring the role of volunteers working within the system and how reports can escalate to different levels.

Give feedback on the consultation, design, and development process

Please read this page and provide feedback on the overall process for this User reporting system consultation. Tell us what's missing or unclear. Suggest more ways to include more parts of the on-wiki community and the larger Wikimedia movement.

Decision making

To make a good decision about a new user reporting system it is essential to listen and learn from you and the rest of the Wikimedia movement. We are genuinely seeking to understand and take into account the feedback we receive. However, the final decision about the type of software to support and Trust and Safety approaches will be made by the Wikimedia Foundation, after consideration of all the available information and the Wikimedia movement's 2030 strategic direction.

Local wiki self-governance

The Wikimedia Foundation's general approach, as described in the Terms of Use, section 10, is to respect the local self-governance of the volunteer communities it supports where possible. This approach, in which we partner with community members rather than seeing them as customers, differentiates our methods from the Trust & Safety approaches of most other large websites.

While the Wikimedia Foundation's Community health initiative will make the final decisions, we are entering this process with a strong interest to learn about existing community practices for handling cases of harassment and other types of abusive user conduct. It is essential for a new user reporting system to integrate into the local wiki methods of self-governance. Local policies remain primary on all Wikimedia projects, as explained in the Terms of Use, and Trust and Safety office actions are complementary to those local policies.

Trust and Safety (Wikimedia Foundation)

There are some rare instances when Wikimedia Foundation Trust and Safety will take action to protect the safety of the community and the public. This happens where action at the local community governance level is either insufficient or not possible, and in rare cases it may require the Wikimedia Foundation to override local policy, such as to protect the safety of the Wikimedia communities or the public. See Trust & Safety Office actions for more details.

How the decision will be made

There may be difficult decisions to make about which software features to prioritize and how to allocate resources toward other aspects of a user reporting system. When it comes time to make a decision, all valid options will be weighed against the following criteria:

  • Which options most align with Wikimedia movement values?
  • Which options are most in alignment with the Strategic Direction of Knowledge Equity?
  • Which options most align with the goal "to build a new harassment reporting system that produces higher quality reports that can be successfully processed and does not further alienate victims of harassment"?
  • Which option will result in a more accessible user experience, for anyone on any device?
  • Which option will result in a more sustainable product that will be resilient to changing technologies, evolving use cases, and user expectations?
  • Which options do not introduce undue risk for achieving our project goals?

Spread the word

The success of this project depends on collecting ideas and feedback from people with a variety of different roles in the Wikimedia movement.

  • Translate this consultation page.
  • Sign up to volunteer to share information about this consultation on your wiki or to your Wikimedia affiliate organization.


Join a focus group

In order to collect ideas and feedback from people with a variety of roles and experiences in the Wikimedia movement, one or more focus groups will be formed.

Learn more about user reporting systems

The Community health initiative, led by Wikimedia Foundation researcher Claudia Lo, is conducting research projects related to user reporting systems.

The findings generated from this research will guide this and ongoing community consultations, inform product decisions, and give wiki communities a more thorough understanding of how their reporting systems operate compared to those of other Wikimedia communities and other online platforms.

Overview of English Wikipedia's reporting system

 
[Diagram: Enwiki Reporting system summary]

This research is available as a summary report, with an accompanying diagram of formal reporting spaces on English Wikipedia.

This project provides a summary of existing reporting systems, both formal and informal, on English Wikipedia. It also defines some key terms for future conversations on this topic, such as involved users, the difference between formal and informal systems, and language for discussing broader concerns of community values and issues of labour. We find that key issues with the current reporting system include its difficulty of access, eroded trust in current systems, the unstructured nature of noticeboards, and the clash between a desire for transparency and concerns over safety and privacy. We also find that informal systems, relying on existing relationships between editors, serve as an ad hoc private reporting system, as well as a complementary structure to formal systems that can guide reporters through the often-tricky process of following the official reporting procedures.

Considerations for user-to-volunteer reporting systems

This project took place over February 2019 and produced both a summary report and an accompanying detailed report.

On Wikipedia, most content and conduct disputes are handled by groups of volunteers. Accordingly, reports of such disputes are first routed to them, and only in cases of immediate danger or outsized harm do reports bypass this volunteer system and go directly to the Foundation's Trust & Safety team. At this stage in our ongoing project on creating a reporting system for Wikimedia projects, we could learn from investigating peer platforms' implementations of private reporting systems. Not only could this comparison provide interesting design examples for us; new editors' expectations for reporting systems are also being shaped by the reporting systems they encounter elsewhere online.

Though many online platforms incorporate some form of reporting system, these typically channel user reports directly to an in-house or contracted team of employees. This makes them most analogous to the emergency wikimedia.org reporting channel. However, it also makes them different from the types of reporting systems to which our community is accustomed, and thus from the types of reporting systems we will be expected to use as a basis for design.

By conducting a review of existing best practices documents and research on this subject, we can create an assessment rubric to evaluate private peer-to-volunteer reporting systems. Some of the most prominent platforms using such a system include Reddit and Facebook Groups. We can run these platforms through this rubric, and additionally compare the current state of Wikipedia’s reporting systems, for a comparative understanding of these mechanisms.

Research to date

 
 
[Diagram: Enwiki Reporting system workflow]

The Wikimedia Foundation's Anti-Harassment Tools team wants to better understand these existing systems to identify any pain points or shortcomings that we can address with improved software. Our research will focus heavily on English Wikipedia as the largest wiki community, but we aim to build a tool that can be used by any wiki of any size. Research completed to date includes:

See also