Incident Reporting System


    The Wikimedia Foundation wants to improve the way users who experience harassment and other forms of abuse report such harmful incidents, in order to provide a safer and healthier environment for the community.

    The Trust and Safety Tools team has been tasked with building the Private Incident Reporting System (PIRS). We intend to make it easy for users to report harmful incidents safely and confidentially.

    Project background

    Reporting and processing harmful incidents has been a topic of interest for the Wikimedia communities for many years. With a new Universal Code of Conduct in place, it is also essential to start the conversation about a user reporting system.

    The handling of incidents and policy violations across Wikimedia spaces and projects has evolved organically and varies between communities.

    Each Wikimedia project or community has its own way of managing things. Incidents are reported and processed in a variety of ways:

    • on wiki talk pages
    • on noticeboards
    • via email
    • in private discussions on off-wiki communication channels (Discord, IRC)

    For a large number of users, it is still unclear what to do when an incident occurs: where to go, who to talk to, how to file a report, what information to include in it, how the report is processed, what happens afterwards, etc.

    Users need to understand how to report a problem and where to do so. There is also very little information about what happens after a report is filed and what expectations users should have.

    Some users do not feel safe reporting incidents when they occur, both because of the complexity of the reporting process and because of confidentiality concerns.

    There is currently no standardized way for users to file a report confidentially.

    Project focus

    The main goal of this project is therefore to make it easier to handle cases of harassment and harmful incidents.

    We want to ensure the privacy and safety of reporters. We also want to make sure that reports contain the right information and reach the entities authorized to process them, without creating an additional burden for those doing the processing.

    The Trust and Safety Tools team also sees this incident reporting system as part of a larger incident management ecosystem that covers, for example, preventive work such as managing disputes before they escalate, incident processing, linking and tracking cases, etc.

    Technical and product updates

    Incident reporting update: November 7, 2024

    We’re continuing the work on the Incident Reporting System.

    The next step is to develop an MVP (Minimum Viable Product) that we can test in production on a few pilot wikis. Following our Minimum Testing Product (MTP) user testing some months ago, we have made improvements to the design, which now distinguishes two types of incidents:

    • Emergency incidents - Incidents relating to immediate threats of physical harm. These incidents need to be handled by the Wikimedia Foundation emergency team.
    • Non-emergency incidents - Incidents that are not immediate threats of harm, for example bullying, sexual harassment and other unacceptable user behavior. These incidents are handled through local community processes.

    The Emergency user flow will direct users to file a report that will be sent to the Wikimedia Foundation emergency team.

    The Non-emergency user flow will direct users to reach out to their local community for support, as outlined in community policies. This will be done through a “Get support” page that will contain links and information specific to each community. The intention is to have configuration options on this page so that each local community can add the relevant links as necessary.
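    To make the two flows concrete, here is a minimal sketch, in Python, of how the routing and the configurable “Get support” links might fit together. It is an illustration under stated assumptions, not the actual extension code: the Report, route_report, and GET_SUPPORT_LINKS names are hypothetical, and the emergency address is the test address mentioned in the November 2023 section below.

```python
# Hedged sketch of the two user flows described above. All names are
# hypothetical; only the flow logic follows the description on this page.
from dataclasses import dataclass

# Test address mentioned in the November 2023 Beta section of this page.
EMERGENCY_TEAM_ADDRESS = "emergency@wikimedia.org"

# Hypothetical per-community configuration for the "Get support" page;
# each local community would add its own relevant links.
GET_SUPPORT_LINKS = {
    "example-wiki": [
        "link to the local conduct policy",
        "link to the local noticeboard",
    ],
}

@dataclass
class Report:
    wiki: str           # wiki the report was filed on
    is_emergency: bool  # immediate threat of physical harm?
    summary: str        # what the reporter entered in the form

def route_report(report: Report) -> str:
    """Direct a report down the emergency or non-emergency flow."""
    if report.is_emergency:
        # Emergency flow: the report goes to the WMF emergency team.
        return f"Send report to {EMERGENCY_TEAM_ADDRESS}"
    # Non-emergency flow: show the community-specific "Get support" page.
    links = GET_SUPPORT_LINKS.get(report.wiki, [])
    return "Show 'Get support' page with: " + "; ".join(links)
```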

    We have some screenshots to demonstrate how this might work. For the next deployment the main focus is to test the emergency user flow in production.

    Designs

    Emergency flow:

    Non-emergency user flow:

    We’d love to hear your thoughts on the current designs! Please comment on the discussion page.

    Test the Incident Reporting System in Beta – November 2023

    We invite editors to test the initial version of the Incident Reporting System. It makes it possible to file a report from the talk page where an incident occurs. This version is for learning about filing reports to a private email address (e.g., emergency(_AT_)wikimedia.org or an admin group). It doesn't cover all scenarios, like reporting to a public noticeboard. We need your opinions to see if this approach is effective.

    To test:

    1. Visit any talk namespace page on Wikipedia in Beta that contains discussions. We have sample talk pages available at User talk:Testing and Talk:African Wild Dog that you can use. Make sure you are logged in.

    2. Next, click on the overflow button (vertical ellipsis) near the Reply link of any comment to open the overflow menu and click Report (see slide 1). You can also use the Report link in the Tools menu (see slide 2).

    3. Proceed to file a report: fill in the form and submit it. An email will be sent to the Trust and Safety Product team, who will be the only ones to see your report. Please note this is a test, so do not use it to report real incidents.

    4. As you test, ponder these questions:

    • What do you think about this reporting process? In particular, what do you like or dislike about it?
    • If you are familiar with extensions, how would you feel about having this on your wiki as an extension?
    • Which issues have we missed at this initial reporting stage?

    5. Following your test, please leave your feedback on the talk page.

    If you can't find the overflow menu or Report links, or the form fails to submit, please ensure that:

    • You have logged in
    • Your Beta account email address is confirmed
    • Your account is more than 3 hours old and you have at least 1 edit
    • You have enabled DiscussionTools, since the MTP is integrated with it

    If DiscussionTools doesn’t load, a report can be filed from the Tools menu. If you can't file a second report, please note that there is a limit of 1 report per day for non-confirmed users and 5 reports per day for autoconfirmed users. These requirements help to reduce the possibility of malicious users abusing the system.
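    As a rough illustration, the requirements and rate limits above can be expressed as a single check. The following is a minimal Python sketch: the User fields and function name are hypothetical, while the thresholds (logged in, confirmed email, 3-hour account age, at least 1 edit, 1 vs. 5 reports per day) come from the text above.

```python
# Hedged sketch of the Beta testing requirements and rate limits above.
from dataclasses import dataclass

MIN_ACCOUNT_AGE_HOURS = 3   # account must be over 3 hours old
MIN_EDIT_COUNT = 1          # at least 1 edit
DAILY_REPORT_LIMIT = {"non-confirmed": 1, "autoconfirmed": 5}

@dataclass
class User:
    logged_in: bool
    email_confirmed: bool
    account_age_hours: float
    edit_count: int
    is_autoconfirmed: bool
    reports_filed_today: int

def may_file_report(user: User) -> bool:
    """Check whether a tester should be able to see and submit the form."""
    if not (user.logged_in and user.email_confirmed):
        return False
    if user.account_age_hours <= MIN_ACCOUNT_AGE_HOURS:
        return False
    if user.edit_count < MIN_EDIT_COUNT:
        return False
    group = "autoconfirmed" if user.is_autoconfirmed else "non-confirmed"
    return user.reports_filed_today < DAILY_REPORT_LIMIT[group]

# Example: a confirmed but 2-hour-old account is still blocked.
print(may_file_report(User(True, True, 2.0, 1, False, 0)))  # False
```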

    Please see the research section of the page for more.

    Work process

    Project phases
    Figuring out how to manage incident reporting in Wikimedia spaces is not an easy task. There are many risks and many unknowns.

    Because this is a complex project, it will involve several iterations and distinct phases. For each phase, we will hold one or more rounds of discussion to make sure we are on the right track and that community input is incorporated early, before diving into larger phases of work.

    Phase 1

    Preliminary research: gathering input and reading the existing documentation.

    Conducting interviews to better understand the scope of the problem and to identify the important questions that need answering.

    Defining and discussing possible product directions and the scope of the project. Identifying wiki projects that could serve as pilots.

    By the end of this phase, we should ideally have a solid understanding of what to build.

    Phase 2

    Building prototypes to illustrate the ideas that emerged in Phase 1.

    Creating a list of possible options for more in-depth consultation and review.

    Phase 3

    Identifying and prioritizing the best ideas.

    Transitioning to software development and breaking the work down into Phabricator tickets.

    Continuing the cycle for the next iteration.

    Research

    July 2024: Incident Reporting System user testing summary

    In March 2024, the Trust & Safety Product team conducted user testing of the Minimum Viable Product (MVP) of the Incident Reporting System to learn if users know where to go to report an emergency incident, and if the user flow makes sense and feels intuitive.

    We learned the following:

    • During user testing, all participants found the entry point for reporting an incident, and the current user flow was well understood.
    • There was some confusion over two of the reporting options: “someone might cause self-harm” and “public harm threatening message”.

    Two participants also made assumptions about the system being automated. One participant was concerned about automation and wanted a human response, whereas the other felt assured by the idea that the system would check whether the abuser had any past history of threats and offences, and delete the offensive comment accordingly. All participants expected a timely response (on average, within 2-3 days) after submitting a report. Read more.

    September 2023: Sharing incident reporting research findings

     
    Research Findings Report on Incident Reporting 2023

    The Incident Reporting System project has completed research about harassment on selected pilot wikis.

    The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

    The findings of the studies have been published.

    In summary, we received valuable insights on the improvements needed for both on-wiki and off-wiki incident reporting. We also learned more about the communities' needs, which can serve as valuable input for the Incident Reporting tool.

    We are keen to share these findings with you; the report has more comprehensive information.

    Please leave any feedback and questions on the talk page.

    Pre-project research
     

    The following document is a comprehensive review of the research conducted by the Wikimedia Foundation from 2015-2022 on online harassment on Wikimedia projects. In this review we have identified the main themes, insights, and areas of concern, and provided direct links to the literature.

    The Trust and Safety Tools team has studied previous research and community consultations to inform our work. We revisited the proposal for a user reporting system (from the Community Health Initiative project) and the 2019 request for comments on a user reporting system. We have also tried to map some conflict-resolution flows across wiki projects to understand how communities currently manage conflict. Below is a map of the conflict-resolution flow on the Italian Wikipedia, with notes on opportunities for automation.

     
    On the Italian Wikipedia, there is a three-step policy for conflict resolution. This map illustrates that process and attempts to identify opportunities for automation for editors and administrators.

    Frequently Asked Questions

    Questions and answers from Phase 1 of the project

    Q: The project used to be called PIRS, Private Incident Reporting System. Why was the P dropped?

    A: We have renamed the Private Incident Reporting System to the Incident Reporting System; the word “Private” has been removed. In the context of harassment and the UCoC, the word “Private” refers to respecting community members’ privacy and ensuring their safety. It does not mean that all phases of reporting will be confidential. We have received feedback that the term is therefore confusing and can be difficult to translate into other languages, hence the change.

    Q: Is there data available about how many incidents are reported per year?

    A: Right now there is not a lot of clear data we can use. There are a couple of reasons for this. First, issues are reported in various ways and those differ from community to community. Capturing that data completely and cleanly is highly complicated and would be very time consuming. Second, the interpretation of issues also differs. Some things that are interpreted as harassment are just wiki business (e.g. deleting a promotional article). Review of harassment may also need cultural or community context. We cannot automate and visualize data or count it objectively. The incident reporting system is an opportunity to solve some of these data needs.

    Q: How is harassment being defined?

    A: Please see the definitions in the Universal Code of Conduct.

    Q: How many staff and volunteers will be needed to support the IRS?

    A: Currently the magnitude of the problem is not known, so the number of people needed to support this is also not known. Experimenting with the minimum viable product will provide some insight into the number of people needed to support the IRS.

    Q: What is the purpose of the MVP (minimum viable product)?

    A: The MVP is an experiment and an opportunity to learn. This first experimental work will answer the questions that we have right now. The results will then guide future plans.

    Q: What questions are you trying to answer with the minimum viable product?

    A: Here are the questions we need to answer:

    • What kind of reports will people file?
    • How many people will file reports?
    • How many people would we need in order to process them?
    • How big is this problem?
    • Can we get a clearer picture of the magnitude of harassment issues? Can we get some data around the number of reports? Is harassment underreported or overreported?
    • Are people currently not reporting harassment because it doesn’t happen or because they don’t know how?
    • Will this be a lot to handle with our current setup, or not?
    • How many are valid complaints compared to people who don't understand the wiki process? Can we distinguish/filter valid complaints, and filter invalid reports to save volunteer or staff time?
    • Will we receive lots of reports filed by people who are upset that their edits were reverted or their page was deleted? What will we do with them?

    Q: How does the Wikimedia movement compare to how other big platforms like Facebook/Reddit handle harassment?

    A: While no other platform is an exact analogue, the Wikimedia movement is most often compared with Facebook and Reddit in regard to how we handle harassment. It is important to consider that nobody has solved harassment. Other platforms struggle with content moderation, and they often have paid staff who try to deal with it. Two huge differences between us and Reddit and Facebook are the globally collaborative nature of our projects and how communities work to resolve harassment at the community level.

    Q: Is WMF trying to change existing community processes?

    A: Our plan for the IRS is not to change any community process. The goal is to connect to existing processes. The ultimate goals are to:

    • Make it easier for people who experience harassment to get help.
    • Eliminate situations in which people do not report because they don’t know how to report harassment.
    • Ensure harassment reports reach the right entities that handle them per local community processes.
    • Ensure responders receive good reports and redirect unfounded complaints and issues to be handled elsewhere.