Incident Reporting System/Updates

    Wikimedia editors are invited to test a minimum testable version of the incident reporting system. The Trust and Safety Product team has built a basic version of a tool that lets you report an incident from the talk page on which that incident occurred. Note: this version of the tool is intended for testing the delivery of reports to non-public email addresses, such as emergency(_AT_)wikimedia.org, or to administrator groups. It does not cover all use cases, such as reporting incidents to public noticeboards. We need your feedback to find out whether this approach works.

    To test the tool:

    1. Go to any talk page on Beta Wikipedia. Example talk pages, available after logging in, are User talk:Testing and Talk:African Wild Dog.

    2. Click the ellipsis icon next to the "Reply" link on any comment and select "Report" (see slide 1). You will also find a link for reporting an incident in the Tools menu (see slide 2).

    3. Write your report, fill in the form and submit it. An email will be sent to the Trust and Safety team, and only to that team. Remember, this is only a test; do not use the form to report real incidents to the T&S team.

    4. While testing, consider the following questions:

    • What do you think of this reporting process, and in particular what do you like and dislike about it?
    • If you are familiar with MediaWiki extensions, what do you think about deploying such an extension on your wiki?
    • What is missing at this stage of incident reporting?

    5. After testing, post your feedback on this talk page.

    If you do not see the ellipsis menu or the "Report" link, or if the form did not submit, make sure that:

    • you are logged in
    • your email address on Beta Wikipedia has been confirmed
    • your account is at least 3 hours old and has made at least 1 edit
    • you have the DiscussionTools extension enabled, because the tool you are testing is integrated with it

    If DiscussionTools does not load, you can still submit a report from the "Tools" menu. If you cannot submit a second report, note that the limit for unconfirmed users is 1 report per day; for confirmed users it is 5 reports per day. This limit is in place during the testing phase to prevent abusive testing.
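
    For anyone who wants to recreate a similar setup on their own test wiki, the sketch below shows roughly how the two extensions could be loaded in LocalSettings.php and how per-day limits like the ones above could be expressed with MediaWiki's standard $wgRateLimits mechanism. It is only an illustration under assumptions: the 'reportincident' rate-limit key is a hypothetical name chosen for this example, and mapping "unconfirmed"/"confirmed" to the 'newbie' and 'autoconfirmed' keys is our reading, not a setting documented on this page.

      // LocalSettings.php – illustrative sketch only, not the official configuration.
      // Load the two extensions that the test setup relies on.
      wfLoadExtension( 'DiscussionTools' );
      wfLoadExtension( 'ReportIncident' );

      // Express the per-day submission limits described above in MediaWiki's
      // standard $wgRateLimits format: [ number of actions, window in seconds ].
      // 'reportincident' is a hypothetical action key used only for this example.
      $wgRateLimits['reportincident']['newbie']        = [ 1, 86400 ]; // unconfirmed accounts: 1 report per day
      $wgRateLimits['reportincident']['autoconfirmed'] = [ 5, 86400 ]; // confirmed accounts: 5 reports per day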

    Sharing incident reporting research findings – September 2023

    Research Findings Report on Incident Reporting 2023

    We have completed research about harassment on selected pilot wikis. The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

    We have published findings of the studies.

    Four Updates on the Incident Reporting System – July 2023

    Hello everyone! For the past couple of months the Trust and Safety Product team has been working on finalising Phase 1 of the Incident Reporting System project.

    The purpose of this phase was to define possible product direction and scope of the project with your feedback. We now have a better understanding of what to do next.

    1. We are renaming the project to the Incident Reporting System

    The project is now known as the Incident Reporting System, with the word "Private" removed.

    In the context of harassment and the UCoC the word “Private” refers to respecting community members’ privacy and ensuring their safety. It does not mean that all phases of reporting will be confidential.

    We have received feedback that this term is confusing and can be difficult to translate into other languages. Therefore we are removing it.

    2. We have some feedback from researching pilot communities

    We are conducting research on harassment in the Indonesian and Korean Wikipedia communities. With their feedback, we have been able to document how users in these communities report harassment and to create maps from that information. These maps represent, to the best of our knowledge, how community members on both wikis currently report incidents of harassment and abuse.

    If you have any feedback on these maps, you can give it on the talk page.

    3. We have updated the project’s overview

    What we want to build moving forward

    • The Trust & Safety Tools team will be developing an extension for reporting incidents/UCoC violations.
    • The extension is intended to be configurable; communities should be able to adapt it to their local processes (see the hypothetical configuration sketch after this list).
    • The extension name is ReportIncident.
    • The purpose of the extension is to:
      • Facilitate the filing of reports about various types of UCoC violations by Wikimedians
      • Route those reports to the appropriate entities that will need to process them
      • Facilitate the filing of reliable reports and filter out/redirect the unactionable ones
      • Facilitate the filing of both private (e.g. to an email address) and public (e.g. on-wiki to an admin noticeboard) reports according to local processes
    • The extension is intended to be incident-agnostic (able to support the reporting of different types of incidents).
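
    To make "configurable" more concrete, here is a purely hypothetical sketch of what per-wiki routing configuration could look like in LocalSettings.php. The variable and key names below are invented for illustration and are not actual ReportIncident settings; they only show the kind of local adaptation described above, i.e. routing some report types privately by email and others to a public on-wiki noticeboard.

      // LocalSettings.php – hypothetical illustration of per-wiki routing.
      // These names are invented for this sketch; they are NOT real
      // ReportIncident configuration options.
      $wgIncidentReportDestinations = [
          // Private route: send reports of this type to a confidential address.
          'threats-of-harm' => [ 'type' => 'email', 'target' => 'emergency(_AT_)wikimedia.org' ],
          // Private route: send reports to a local administrators' mailing address.
          'harassment'      => [ 'type' => 'email', 'target' => 'admins(_AT_)example.org' ],
          // Public route: post other reports on an on-wiki noticeboard, per local process.
          'other'           => [ 'type' => 'noticeboard', 'target' => "Project:Administrators' noticeboard" ],
      ];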

    What we won’t be doing

    • The system is intended for reporting and routing only; we will not be dealing with processing reports.
    • The system is intended for incidents related to UCoC violations. We will not use it for other types of requests (such as technical support requests, account access, etc.).
    • The system is NOT meant to replace existing processes on wikis. Our purpose is to make it easier to follow existing processes.

    4. We have the first iteration of the reporting extension ReportIncident

    In November last year we talked about how we should start small with a very limited scope, so for our first iteration we thought about creating a very basic experience.

    What’s included in this initial iteration?

    • Ability to report from User Talk page
      • Report a topic header
      • Report a comment
    • Ability to complete a basic form and submit
    • The report will be sent to an email address (a dummy email for testing purposes).

    Designs

    The first version of the MTP (minimum testable product) will let a Wikimedian report an abusive topic header or comment on a talk page. Here are the designs.

    Implementing Designs – What’s next

    The Trust and Safety Product team is now working on developing these initial designs as an MTP, a proof of concept that will be deployed to the Beta Cluster and tested internally. The purpose of this is to assess technical viability. If everything goes well, the next step is to deploy to test.wikimedia.org for usability testing and feedback.

    Looking forward to your feedback about this first iteration on the talk page!

    November 2022

    Our main goal for the past couple of months was to understand the problem space and understand what people are struggling with, what they need, and their expectations around this project. We did this by:

    • Reviewing and synthesizing harassment research, surveys and other relevant documentation (going back to 2013)
    • Having user interviews with volunteers who have experienced or witnessed harassment on Wikipedia
    • Having discussions with staff members, the UCoC drafting committee, and wiki functionaries.

    Our purpose was to identify priorities, scope and a possible product direction.

    Conclusions and next steps

    Focus on safety

    The recommendation from the Movement Strategy discussions is to provide for safety and inclusion within the communities. As our ultimate goal is for people to feel safe when participating in Wikimedia projects, we will use this as the guiding principle for what to focus on in the minimum viable product (MVP).

    Project approach: start small

    There are a lot of things to take into consideration when thinking about this project.

    • Many types of Users: reporter, responder, observer, accused, monitor
    • Many Use cases: doxing, abuse of power, content violations, security breaches, legal issues etc.
    • A lot of Complexities: admins as harassers, off-wiki harassment, government interference etc.

    This project will grow and become more complex over time. So we need to start really small, with a very limited scope before we dive into anything more complex.

    Focusing on two types of users

    We have identified different types of users:

    • Reporters: Users who have experienced harassment and submit a report.
    • Responders: Users who receive the report, and want to help.
    • Accused: The users who are named in the report.
    • Monitor: People who are interested in tracking the progress of reports, to understand the problem better or to ensure that people are treated properly.

    Since we want to start small, we will focus on reporters and responders first.

    MVP Approach (Short-term)

    The way we would like to approach this is to build something small that will help us figure out whether the basic experience actually works.

    Principles of the MVP:

    • We will design for, test on, and release to a few pilot wikis
    • Since our goal is to address safety we are going to focus only on 3.1 (Harassment) in the UCoC.
    • We will explore a basic experience for two user groups only:
      • Reporters will understand how to file a report, and feel comfortable enough to complete the report process.
      • Responders will receive clear reports, giving them the information that they need in order to understand the problem.
    • The MVP will connect to current systems as they are (we are not changing any existing processes)

    This experiment should also help us explore and answer some important questions and learn things as we go:

    • Entry points (where reporting starts) – what are they, should we have one or more?
    • Users – do people easily discover the entry point? What do they think will happen when they engage with it?
    • Scale – can we do this at scale? Will we overwhelm the responders? etc.
    • Data – can we build something that will help us collect the data we need in order to make decisions? What can we measure to know we’re moving in the right direction?

    What we are not doing (yet)

    The idea is to start with a really small scope, try a few things and learn as we go. Therefore we need to be very clear about what we are not going to do yet:

    • We are not solving for bad admins and/or other complex use cases
    • We are not fixing existing flawed processes
    • Not everything in the UCoC is about safety, but we are focusing only on safety
    • Agnostic reporting – we cannot do this without validating that a basic reporting experience works with a specific type of incident

    What happens after the MVP (long-term)

    We have some ideas about v2 and v3 but we want to experiment with an MVP first and see how people feel about it. What we learn now will be useful to make decisions about future versions.

    Some v2 and v3 ideas include:

    • Private reporting (creating a private space for reporters and responders to interact)
    • Escalation (having the ability to route cases to a different entity for further support)

    In order to explore these two ideas we need to ensure the basic/core experience actually works. If it does we will build on top of it.

    Discussion points

    • What do you think about this approach?
    • What scares/concerns you about this project?

    Looking forward to your feedback on the talk page!

    September 2022

    We have been collecting feedback, reading through existing documentation and conducting interviews in order to better understand the problem space and identify critical questions we need to answer. We are currently synthesising the information we have collected in an effort to start defining a clearer scope for the project. It is a lot of information to go through, so this might take a while; there are so many things we need to learn!