Universal Code of Conduct/Training/Module 3 - Complex cases, appeals and closing

Introduction

Start here

This module builds upon the foundational knowledge provided in the previous modules, with a focus on enhancing your understanding and skills in managing more complex UCoC violations. The aim is to prepare you for handling difficult situations that go beyond straightforward rule-breaking. This includes assessing threats, communicating kindly and sensitively, and reporting and engaging with a safety-first mindset. Furthermore, we will guide you through the process of handling appeals and closing cases, including understanding the appeals process, making informed decisions, and ensuring fairness and transparency.

By the end of this module, you will:

  • Gain an understanding of how to manage complex UCoC violation cases
  • Learn to assess the credibility of threats and communicate effectively and sensitively in challenging situations
  • Acquire skills to protect the safety of victims and vulnerable community members
  • Understand the appeals process for UCoC violations, including how to handle, evaluate, and close appeals with fairness and integrity

Audience

This is an advanced module designed for those looking to deepen their engagement with the UCoC, particularly prospective U4C members, advanced rights holders, and community-elected functionaries who have signed the Access to Nonpublic Personal Data Policy. Completion of the previous modules is a prerequisite to ensure a comprehensive understanding of the UCoC framework. This training is crucial for those aiming to join the U4C, as well as for those who seek to uphold the highest standards of conduct within the Wikimedia movement.

Note

Considering the continued applicability of the material, content in this module was reused and adapted from the training modules created by the Trust and Safety team as part of their 2016-17 annual plan goal to “introduce new, translatable, online training modules to help functionaries and affiliates better manage issues.”

Safety First

Your safety

Curating the world’s largest encyclopedia comes with certain risks, and being prepared is the best way to stay safe. As a seasoned Wikimedian taking this course, you are likely well aware of some of the uglier sides of Wikimedia, such as harassment, threats, or intimidation.

Indeed, community members working towards upholding the UCoC can become targets themselves and have their names and communications spread on the internet. When communicating with both the reporting party and the accused, use some simple rules to protect yourself.

Realize that anything you write may be shared or "leaked" publicly. Think about how your words could be taken out of context, or used against you, as you write. For communications regarding your Wikimedia role, consider using a separate email address, one that does not give personal details in the address name. Do not give personal details in your communications. Sometimes it is tempting to give personal experiences to show you empathize with someone suffering harassment (e.g. "I saw a very similar situation when I worked at my campus help center in Mumbai"), but you need to protect your own privacy.

Privacy is at the heart of staying safe online, because the more information about you a bad actor can find, the more harm they can do. This may seem ironic for Wikimedians, whose platform and movement are built on transparency and openness, but it is a balance that must be maintained.

Privacy checkup:

  • Username: Take a moment to reflect on your username. Does it reveal any personal details? Learn more about how a username can keep you safe.
  • Personal information: Have you shared any personal details on your userpage or elsewhere on Wikimedia projects?
  • Edit history: Can your edits be traced back to you as an individual? In the early days of our Wikimedia journey, we often edited on familiar topics like our local school or a landmark. Could such details inadvertently reveal our identity?
  • Broader internet: Considering the interconnected nature of the internet, it's vital to also contemplate your wider online presence and the information that potential bad actors could extract from it.

Further reading:

Activity
Fill out this information mapping worksheet. (Currently a Google Doc; if the activity is accepted, it will be changed.) The goal of this worksheet is to help you identify and think through your online presence by mapping your digital footprint across various apps, services, and platforms. Then, take action if needed.

Remember, knowing what information you have shared online, where it is, and who has access to it will help you assess threats made against you.

But how can you identify potential threats and vulnerabilities before they materialize? This process is called threat modeling. More on this in the next section.

Threat modeling - what is your safety plan?

A threat model is a list of the most probable threats to your privacy and safety. Since it's impossible to protect yourself against every threat, the idea is to think through the most probable ones and create a plan to reduce the chances of them happening.

Focusing on the threats that matter to you narrows down your thinking about the protection you need, so you can choose the tools and steps that are most relevant.

To help in the process, consider the following questions:

What do you want to protect?

In the context of digital safety, we are mostly trying to protect information: for example, your real name, your address, emails, files, or text messages. You may also want to guard against someone impersonating you, say by making edits on a Wikimedia project while pretending to be you.

  • → Make a list of your assets: data that you keep, where it's kept, who has access to it, and what stops others from accessing it.

Who do you want to protect it from?

Here we need to think about who might want to do you harm and try to identify them.

As Wikimedians, this can include a variety of parties: fellow community members who might be upset about a certain action you took, an organization that is not happy about the information you are adding, or perhaps even a government.

Thinking about who might want to target you and why will allow you to think about appropriate steps you can take to protect your information and yourself.

  • → Make a list of your adversaries, or those who might want to get hold of your assets. Your list may include individuals, a government agency, corporations, a vengeful ex, etc.

How likely is it that you will need to protect it?

When considering how likely it is that you will need to protect your assets, it's important to think about the capabilities of potential bad actor(s). For example, the capabilities of a disgruntled Wikimedian are quite different from those of a hate group or a government.

It is important to distinguish between what might happen and the probability it may actually happen. For the list of bad actors you’ve written down, rate both the risk that they will attack you and their capability, i.e. how likely it is that they would be successful.

  • → Write down which threats you are going to take seriously, and which may be too rare or too harmless (or too difficult to combat) to worry about.

How bad are the consequences if you fail?

A safety plan involves understanding how bad the consequences could be if a bad actor successfully gains access to your information.

For example, if a bad actor were able to figure out your real name, what you look like, or your sexual orientation, how bad would that be for you?

Consider what a bad actor might want to do with your private data and how bad the consequence might be.

  • → Write down what your adversary might want to do with your private data.

How much trouble are you willing to go through to try to prevent potential consequences?

To answer this question, you will need to go back to your responses in question three.

  • → Write down what options you have available to you to help mitigate your unique threats. Note if you have any financial constraints, technical constraints, or social constraints.
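Purely as an illustration, and not an existing Wikimedia tool, the answers to the five questions above could be captured in a simple structured record like the sketch below; all field names and example values are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ThreatModelEntry:
    """One row of a personal threat model: an asset, who might target it, and your plan."""
    asset: str                    # what you want to protect, e.g. "real name"
    stored_where: str             # where that information lives and who can access it
    adversaries: list[str]        # who might want to obtain or misuse it
    likelihood: str               # "low" / "medium" / "high": chance of a successful attempt
    impact: str                   # how bad the consequences would be if it succeeded
    mitigations: list[str] = field(default_factory=list)  # steps you are willing to take

# Hypothetical example entry:
example = ThreatModelEntry(
    asset="home address",
    stored_where="an old userpage revision",
    adversaries=["disgruntled editor"],
    likelihood="medium",
    impact="high",
    mitigations=["request revision deletion", "remove details from userpage"],
)

Whether you keep such a list in a spreadsheet, a private document, or on paper matters less than storing it somewhere secure and revisiting it periodically.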

See this threat modeling checklist by Online SOS to help you think through this exercise, and learn more from the Electronic Frontier Foundation's security plan guide. Now, what do you do if you are faced with an actual threat?

Assessing threats

Our understanding of safety and security is deeply subjective, and as such, an accurate assessment is based on your understanding of yourself and the context in which you find yourself. Consider being a functionary in a place like Russia compared to Germany: the different contexts would drastically change your sense of safety. Now add your gender, sexual orientation, or profession. How would that change your assessment?

While there is no one-size-fits-all way to assess threats, below are a few steps you can take to guide your thinking and calculation.

  • Situate yourself: Consider your role within Wikimedia, your level of involvement in the community, and any personal characteristics or affiliations that may make you a target or would affect an assessment of the threats made against you.
  • Contextualize Wikimedia: The risks of engaging with Wikimedia are drastically different depending on where you are; consider Iran and India, for instance. As such, to accurately assess threats against you, you have to contextualize Wikimedia to where you are located.
  • Study the source: What can you learn about who is making the threats? Is it someone you know or have interacted with before? Are they a fellow community member? Are the threats made on-wiki, off-wiki or a combination? What is the threat about? Why are they threatening you?
  • Evaluate credibility: Analyze the language and tone of the threats, as well as any past behavior of the sender, to assess the credibility of the messages. Consider whether there is any evidence to support the threats.
  • Gauge severity: Determine the seriousness of the threats and the level of risk they pose to your safety and well-being. Consider whether the threats are specific and actionable or more general in nature.
  • Consider impact: Assess the potential impact of the threats on yourself, your community, and the broader Wikimedia ecosystem. Consider potential consequences and any collateral damage that may occur.

Ultimately, a threat assessment is about figuring out if a threat is serious and who's making it, so you can take steps to stay safe and protect yourself and others. It helps you understand the danger and decide what to do.

All of this can be mentally and emotionally taxing; as such, taking care of those aspects of your health is just as important.

Your emotional and psychological wellbeing

Dealing with situations involving transgressions of community rules, and the reactions they may provoke against you, can be tough. It is important to know how to effectively care for yourself when handling a case, regardless of the outcome.

It's not uncommon to feel personally invested in cases, especially those without easy solutions or that uncover upsetting information. You may experience "secondary trauma" or "caregiver burnout," which is a common feeling of guilt or mental exhaustion experienced by those providing care to others. Your ability to care for others depends on keeping yourself safe and healthy enough to effectively give that care.

Importantly, it is good to remember that you are never obligated to take an action you aren’t comfortable with, particularly in cases involving potential harassment. If you're concerned that answering a question or taking action might result in harassment directed at you, either onwiki or on external websites, consider asking another team member to handle it.

For resources on dealing with harassment, trauma, witnessing upsetting content, caregiver burnout, or general stress management, please visit the Mental Health Resource Center. This page includes a curated list of resources for and about mental health in non-emergency situations. If you are facing an emergency situation, please refer to the list of helplines or emergency numbers listed on the page.

Safety and vulnerable communities

Safety is not a uniform concept and does not apply equally to everyone. There are numerous studies that show people from some communities are targeted more frequently and with higher severity - both online and offline. This is also true in the Wikimedia context.

Some communities are more vulnerable than others due to historical power imbalances, social prejudice, gender disparities and other factors.

What one person considers safe may not be the case for another. Wikimedians come from diverse backgrounds and have different experiences, perspectives, and understandings of what safety and security mean.

Because of the personal nature of these concepts, addressing safety and security concerns can be a challenge. When addressing cases with a safety aspect to them, consider these questions:

  • Am I actively avoiding biases that may impact my judgment?
  • Am I demonstrating empathy without making assumptions about the user's experiences?
  • Is my communication using inclusive language and avoiding stereotypes?
  • Am I focusing on addressing inappropriate behavior rather than questioning the user?
  • Am I taking appropriate measures to safeguard their privacy and have those been clearly communicated?

Given the heightened frequency and severity of attacks, the importance of privacy in ensuring online safety becomes even more evident, particularly for vulnerable communities. So, how do we effectively manage personal information? Let's delve deeper into this topic in the next section.

Handling personal information

Nonpublic personal information (PI) is defined as any information that can be used to personally identify a user and that has not already been made public by the user themselves, whether on the projects or elsewhere.

The most common types of nonpublic PI handled by functionaries are email addresses, IP addresses, user agent information, and names. However, functionaries should be aware of the other types of information that qualify. In addition, although not personally identifying by themselves, each of the following is considered nonpublic PI when associated with either the above or with a user’s account:

  • date of birth;
  • gender;
  • sexual orientation;
  • racial or ethnic origins;
  • marital or familial status;
  • medical conditions or disabilities;
  • political affiliations; and
  • religion.

Consider these:

  • Because User:A publicly stated his real name, email address, address, and phone number on his user page, none of this is considered PI.
  • However, if User:A had not disclosed his real name, email address, address, and phone number, these would be considered PI.
  • User:A's IP address and user agent information are PI.
  • If User:A has not disclosed that he is unmarried, a statement that links his marital status with his username contains PI.

Be sure to refer to the relevant policies, such as the Privacy Policy and the Access to Nonpublic Personal Data Policy, when dealing with personal information.

If you have any questions regarding these policies, please reach out to privacy@wikimedia.org or legal@wikimedia.org.

What makes cases complex?

A complex UCoC case typically involves multiple layers of consideration. These factors not only complicate the process of addressing the case but also require an understanding of community dynamics, policies, and the principles underpinning the UCoC. Here are some elements that may contribute to the complexity of a UCoC case:

  • Cross-Wiki activities: Cases that involve actions or behaviors spanning multiple Wikimedia projects or languages introduce complexity due to the need for coordination across different communities, each with its own norms, policies and dynamics.
  • Safety concerns: Cases involving threats of violence, doxxing, or other safety concerns add a layer of complexity due to the need to navigate implications on the wellbeing of individuals involved.
  • Systemic bias or project capture: Situations where systemic bias, gatekeeping, or "project capture" occurs present complex challenges. Addressing these issues involves unraveling entrenched patterns of behavior and influence.
  • High-profile community members: Cases involving high-profile community members can be complex due to the potential for significant community backlash. Balancing confidentiality, fairness, and the potential impact of the case requires careful consideration.
  • Ambiguity in policy interpretation: Complex cases can involve ambiguity in policy interpretation, particularly when innovative or unprecedented issues arise. These cases may require a careful analysis of the UCoC, local policies, and the principles guiding the Wikimedia movement to reach a fair and principled decision.
  • Multi-faceted disputes: Disputes that encompass multiple issues, such as content disputes intertwined with personal harassment or conflicts involving several parties with conflicting narratives, increase the complexity of resolution.
  • Foundation involvement: In the case that the local community is unable to resolve a situation, Foundation involvement may be necessary to assist in resolving the dispute or taking appropriate action.

The multi-layered nature of complex UCoC cases increases the risk of misunderstandings and miscommunication. Such escalations can have negative effects on all parties involved and on the UCoC process at large. To mitigate these risks, it is vital that everyone is on the same page regarding the facts of the case. Therefore, before addressing a case, it is important to collect all relevant information and ensure alignment with the person making the request. Let's continue in the next section.

Case Intake

This intake table serves to organize and structure the collection of information, facilitating clear communication and efficient case handling. It enables both the case handler and the complainant to establish transparency and understanding before proceeding with the case.

Draft: Case Intake Table
Question | Option | Details
Where did the violation occur?
  • Online - Wikimedia project spaces (e.g. talk pages, userpages)
  • Online - Non-Wikimedia spaces (Facebook group, Telegram channel…)
  • Offline - Wikimedia spaces (community events, etc.)
  • Offline - Non-Wikimedia spaces (randomly on the street, at the workplace…)
  • Both online and offline Wikimedia spaces
  • Both online and offline non-Wikimedia spaces
  • Both online and offline, including Wikimedia and non-Wikimedia spaces
...
What is the type of violation?
  • Violation of project-specific policies
  • UCoC violation
    • Harassment
    • Abuse of power, privilege or influence
    • Content vandalism/abuse of projects
  • Local laws
...
Who is sharing the complaint?
  • Single user
  • Group of users
...
Who is the complaint voiced against?
  • Single user
  • Group of users
  • One or more project functionaries
  • External party or parties
...
Was this violation or related issue addressed in the past? Is there a history to this violation? ...
Which Wikimedia projects does this violation touch upon? ... ...
... ... ...
... ... ...

Once the intake table is filled in, it can be shared with the requester to confirm accuracy and gather any additional information.
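As a purely illustrative sketch of how the draft table above could be recorded consistently, the intake answers could be kept in a structured record like the one below. The field names and option values simply mirror the draft table and are not part of any existing reporting tool.

from dataclasses import dataclass, field

@dataclass
class CaseIntake:
    """Hypothetical structured record mirroring the draft intake table above."""
    where_occurred: str                 # e.g. "Online - Wikimedia project spaces"
    violation_type: str                 # e.g. "UCoC violation: harassment"
    reported_by: str                    # "single user" or "group of users"
    reported_against: str               # e.g. "single user", "project functionary", "external party"
    prior_history: str                  # whether this or a related issue was addressed before
    projects_affected: list[str] = field(default_factory=list)  # Wikimedia projects touched
    notes: str = ""                     # free-text details confirmed with the reporter

# Hypothetical example:
intake = CaseIntake(
    where_occurred="Online - Wikimedia project spaces",
    violation_type="UCoC violation: harassment",
    reported_by="single user",
    reported_against="single user",
    prior_history="no known prior reports",
    projects_affected=["meta.wikimedia.org"],
)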

[The details on how cases will be triaged towards other community governance bodies, such as ArbCom or local functionaries, remain to be determined.]

Activity

To be determined: a sample case whereby one can test the intake table and next steps. Do you have any ideas?

Communication

Something to consider

Reporting UCoC violations, both publicly and privately, can be a difficult thing to do. It's natural to feel uncomfortable accusing someone of wrongdoing. In the case that the person in question is an influential community member, this is even more difficult and can carry some risks.

When people submit reports, they may be afraid that they won’t be taken seriously or that, even if they are listened to, the people who are supposed to help won't care.

When you or your team receives a request, bear in mind that it may have taken a great deal of courage and caused a lot of anxiety to the reporter. Treat every claim as serious, even if its tone seems bombastic or overwrought.

Next, let's learn about empathy and empathetic communication.

What is empathetic communication? Why is it important?

Empathy involves being able to put oneself in someone else's shoes and see things from their perspective.

In communication, empathy means listening, acknowledging emotions, and responding with care. The priority here is to use language that is assuring and non-confrontational.

Due to the nature of the Wikimedia movement, a large majority of our communication happens in the written form. It takes place on discussion pages, forums, mailing lists, and other channels of communication. This makes communicating with empathy a bit more challenging as we need to be mindful of how we use our words to connect with targets of harassment in a meaningful way.

The practice of empathy builds trust in communities and fosters a safe and supportive environment for reporters where all parties feel heard and valued. Let’s look at how this may look in practice!

Empathetic communication in practice

Below are some scenarios and what empathetic and non-empathetic communication styles can achieve:

Scenario: A community member reports experiencing harassment.
  • Non-empathetic response: "Just ignore them, it's part of being online."
  • Empathetic response: "I understand that dealing with harassment is upsetting. Could you help me understand more about what happened? Let's address this together."
  • What the response achieves: Validates their feelings and opens a dialogue for resolution.

Scenario: A user reports a false accusation.
  • Non-empathetic response: "If you didn't do anything wrong, you have nothing to worry about."
  • Empathetic response: "I see these accusations are hurtful. Let's gather the facts and work on clearing this up."
  • What the response achieves: Acknowledges their distress and ensures steps will be taken to resolve the issue.

Scenario: A user feels their complaint wasn't taken seriously.
  • Non-empathetic response: "We get lots of complaints, not all of them can be urgent."
  • Empathetic response: "I understand your concern. Could you help me understand more about your complaint? Every report matters."
  • What the response achieves: Validates their concern and ensures their complaint is reviewed appropriately.

Scenario: A contributor is upset about being blocked after a misunderstanding.
  • Non-empathetic response: "Rules are rules, you should have known better."
  • Empathetic response: "I see that you are frustrated. Let's discuss what happened and resolve this misunderstanding."
  • What the response achieves: Shows willingness to discuss and works towards resolving the issue.

Scenario: A member reports being doxxed by another user.
  • Non-empathetic response: "That's just part of being online. You should have protected your information better."
  • Empathetic response: "I understand that being doxxed is frightening. Please see these resources to learn more about online privacy."
  • What the response achieves: Validates the severity of the situation and takes action to help with their safety situation.

Key phrases to use:

  • "I understand" or "Could you help me understand": Show that you are trying to grasp the situation thoroughly.
  • "That must be (frightening/hurtful/upsetting)" or "I see that you are (frightened/hurt/upset)": Indicate active listening and empathy.

Phrases to avoid:

  • "I disagree": This may lead the reporter to believe you are not here to help.
  • "Nothing we can do": Instead, offer advice or alternative routes forward.
  • "The person accused of harassment has a good reputation": Focus on the reported behavior or actions rather than offering opinions on the person accused.

Part of handling cases empathetically also includes managing expectations for a reporter by appropriately communicating about a case's progress and timelines. Let's explore this further.

Information sharing and expectation management

Sharing information and managing expectations play an important role in fostering transparency, trust, and fairness throughout the UCoC case process. Consider some of the best practices below:

Try to:

  • Offer the target or reporter a timeline. Your goal should be to let them know what to expect. While you will likely not be able to promise a certain result or closure date, you should be able to give them a sense of the projected progress of their investigation. Consider whether you can offer the reporter a "check-in" date.
  • Alert them to any substantial delays that may alter the timeline you offered. Remember that while, for you, this may be one of a dozen active cases, for the reporter it is likely a much higher (and more emotional) priority. Sudden, unexpected silence or lack of apparent progress may feel alarming to them.
  • Contact users in a timely manner to request any additional information your investigation requires. Particularly when an investigation involves multiple people, small delays can compound – try not to add to that by putting off simple steps like asking an important question.

Try not to:

  • Overshare: While it might be tempting to provide as much detail as possible when discussing a case, it's important to remember that the involved parties are neither impartial nor bound to confidentiality. Even if someone is accused of a violation, they still have a right to privacy and confidentiality throughout the investigation process.
  • Make promises you may not be able to keep. While you may wish to reassure a targeted user with "I promise we will stop this behavior" or "You will have an answer by Tuesday", such a reassurance will backfire if you are unable to follow through. Know your limits, both in time and in your role.

Next, let's look at some overall best practices around communication when dealing with UCoC cases.

Best practices

Below are some overall best practices around communication when dealing with UCoC cases.

  • Be prompt: This is arguably the key aspect to an initial response. Don't leave reporters waiting for a reply that may or may not ever come. If you are able to action the complaint immediately, do so and let the reporter know. If the case is complex and you cannot immediately offer a substantive response, let the reporter know in the meantime that you have received their message and will be investigating.
  • Be empathetic: Assume the report is genuine – at the very least, assume it is something that has genuinely distressed the reporting party. Respond kindly, letting the reporter know your team will look into it. Try to avoid generic replies where possible – make it clear that you are responding to their specific situation and that you are responding as another human being.
  • Give concrete timing information, and stick to it: Where possible, give estimates to the reporter on how long things will take to get moving. Be sure to allow yourself plenty of time in these timing estimates; things can come up, and delays can happen – this is not your full-time job, and you are not expected to be able to drop everything when a case comes up.
  • Be informative: This one is difficult, especially if the report came in privately. Being informative doesn't always mean being public or detailed; however, it's usually a good idea to at least keep the reporter up-to-date about the status of your investigations. Follow up with more emails as appropriate as the case goes on.
  • Ask for updates: Let the reporter know that they should forward new developments to you as they occur. If you feel that you need more information to complete your investigation, reach out to the reporter to ask for it.

Having learned how to keep yourself and the people you are supporting safe, and how to communicate in such difficult situations, let's move on to the practice of handling UCoC cases.

Documenting Reports

Documenting reports

Documentation of what you have learned and done in cases is very important for a few reasons.

  • First, private investigations performed off-wiki, as many complex UCoC cases will likely include, are not automatically documented the way on-wiki edits would be. The only thing that the future will know about is what you record.
  • Second, no single person or set of people who performed an investigation can be expected to remain in their role forever; if you drift away from your role in the future, others will need a way to find out what happened and why it happened in any given investigation.
  • On the other hand, be aware that "documentation" doesn't – and shouldn't – mean "public documentation". It also should not mean that you or your team compile a permanent dossier about any involved editors. The parties involved in an investigation are entitled to as much privacy as you can reasonably give them while still doing your job, and it is your responsibility to protect information about them and the investigation by storing it somewhere reasonably secure and not recording or saving extraneous information.

What does appropriate documentation look like?

To a certain extent, what "appropriate documentation" looks like will depend on who is performing the investigation. If you are performing an investigation as part of a team (such as an Arbitration Committee), your team should:

  • Record a summary of your investigation on your team's private wiki, if appropriate based on that wiki's policies.
  • Record the names of those who investigated and/or voted on outcomes for the investigation.
  • Take screenshots or gather diff links of evidence that informed any eventual outcomes of the investigation. Store these somewhere accessible to your team, such as on a private wiki or in an email to your team's secure, archived mailing list.
If you are evaluating an initial complaint before passing it on to users with advanced rights or to the Wikimedia Foundation's Trust and Safety team, make sure that your communication to the other investigating group contains the following (one possible layout is sketched after the list):

  • Contact information for you
  • Contact information for the reporting party and/or victim
  • A summary of the complaint
  • Functional links to any relevant URLs
  • Functional diff links to any specific on-wiki edits relevant to the complaint (if you have them)
  • A summary of any preliminary investigation work you may have done
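As a minimal sketch, assuming no particular tool or template is mandated, the items above could be assembled into a single hand-off message along these lines; the template and function names are hypothetical.

# Hypothetical helper that lays out the hand-off items listed above in one message body.
HANDOFF_TEMPLATE = """\
Contact (person handing off): {investigator_contact}
Contact (reporting party / victim): {reporter_contact}

Summary of the complaint:
{summary}

Relevant URLs:
{urls}

Relevant on-wiki diffs:
{diffs}

Preliminary investigation notes:
{notes}
"""


def build_handoff(investigator_contact, reporter_contact, summary,
                  urls=(), diffs=(), notes="None yet"):
    """Return a plain-text hand-off message containing each required element."""
    def fmt(items):
        return "\n".join(items) if items else "None provided"

    return HANDOFF_TEMPLATE.format(
        investigator_contact=investigator_contact,
        reporter_contact=reporter_contact,
        summary=summary,
        urls=fmt(urls),
        diffs=fmt(diffs),
        notes=notes,
    )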

Where should the documentation be stored?

The answer to this question depends on your role. Some groups, such as Arbitration Committees, may have their own "private" wiki. Other groups may primarily use email in their communications. Individuals receiving reports may have no designated place to document. So, work either in your team's designated space or in a secure document of your own creation to store the information.

Do not use an on-wiki "sandbox" for this, and avoid hosting your documents in publicly accessible places, such as an etherpad or an unsecured "cloud" storage account. Consider privacy-focused collaboration platforms where you can control the sharing settings, and, even better, ones that are end-to-end encrypted. See Privacy Guide's recommendations for private-by-design collaborative tools.

While Google Docs or Microsoft 365 can be useful, you should pay careful attention to the "sharing" or security settings (see Google's help page on this topic). Additionally, consider the for-profit nature of these services and the fact that they are not designed with privacy as a priority. In either case, your documents may well contain personally identifying information, and a "leak" could permanently damage your group's reputation and the public trust that users have in it.

Closing and Appeals

Closing

Any case closure will need to start with documentation. Your documentation should include information about what the case was about, who investigated the report, what they found, and what the outcome was. Please see the Documenting Reports section to refresh your understanding.

Closing non-actionable reports

Sometimes you may deal with cases with no clear actionable outcome, such as a ban or block. Yet, even in such cases, there are some things to consider:

  • Notify the relevant parties of the outcome and offer further resources for help if available.
  • You may wish to provide some detail about why the case was deemed non-actionable. Stick to relevant details, without becoming over-detailed.
  • Depending on the situation, you may need to notify the subject of the report. If the report was clearly mistaken and closed without investigation, the subject may not be aware of the complaint and need not be notified. However, if a full investigation was conducted, including communication with the subject and/or witnesses, it is important to inform the subject that the case is closed and provide them with the outcome.

Next, let's see how to close an actionable case.

Closing actionable reports

Closing an actionable report is a bit more involved, though it is based largely on the same steps.

  • The first step when closing an actionable case will usually be to take any on-wiki action your team has decided on. The timing of this step is important. If it has become necessary to place a block or ban on a user, you want to avoid leaving them in a state of "nothing left to lose," especially if they have advanced user rights that could be misused in retaliation. That will mean placing any blocks or bans first, before notifying the sanctioned user.
  • Next, avoid a gap in time where the sanctioned user is left wondering why a sanction has been placed on them. Make sure to notify relevant parties immediately after placing any blocks or bans. This will include the sanctioned editor, first; the target or person who reported the case to you, second; and possibly any on-wiki venues your community requires sanctions to be posted in.
  • In severe cases involving advanced user rights, it may be necessary to contact stewards or bureaucrats for removal of those rights. However, these teams may not make immediate decisions on such requests. In such cases, you may need to assess if you can move forward with case closure (if you believe their decision will not impact the closing of the case), or wait for their decision.

When communicating with someone you are sanctioning, keep your statements factual and as non-judgmental as you reasonably can given the situation. Communicate clearly what action is being taken against them and, in general terms, why, and what they can do if they wish to appeal the decision. Even in the context of explaining how to appeal, however, it is not appropriate to provide a sanctioned user with the name of, or detailed information provided by, their accuser or target. As always, you should attempt to communicate with both targets and reported users with empathy.

In a movement and on a platform where openness and transparency are key, such an action may warrant a public announcement. Let's explore this in the next section.

Deciding whether a public announcement is necessary

Once you have finished placing any necessary sanctions, you may need to make a public notice of the case outcome. Doing so will not be necessary in all cases, and you will need to use your best judgment and the local policies of your project when deciding what to say publicly about a case and how to say it.

Some projects, like English Wikipedia, publicly announce all removals of advanced user rights via a noticeboard. If your case's closure included one of these actions, your team will be expected to make a public statement about it. Other projects rely more heavily on individual administrator discretion, or on private discussion about these topics. If your project is one of those types, you should not make an undue spectacle of your case's closure.

If you decide that a public announcement is needed, you will likely have to balance providing the public with the needed information while also not compromising on the privacy of those involved. How can you achieve this balance?

Choosing what details to release in a public announcement

When announcing a case closure publicly, avoid disclosing the entire case, evidence, or investigation, as they may contain private or sensitive information that could put the reporter or perpetrator at risk. Aim for a factual and neutral announcement that does not harm the involved parties.

However, keep in mind that most community members will not have access to the detailed background of the case. Strive for an announcement that is clear and understandable to them, without sharing any sensitive information. Use your judgment to achieve a balance between transparency and privacy.

Things your announcement should contain:

  • The username of the sanctioned user
  • The basis of the case (for example, "harassment" or "misuse of private information")
  • The outcome of the case (for example, "user is banned" or "user's administrator rights are revoked")

Things that might be appropriate to include in your announcement:

  • On-wiki diffs of problematic behavior by the sanctioned user if and only if they are vital to describing this case, and they contain no private or hurtful information about either the targeted editor or others

Things that are not appropriate to include in your announcement:

  • Personal details of, or links to content that includes the personal details of, parties involved in the case.
  • The content of, or links to the content of, the harassment. The reason you or your team handled the case privately was because this content was potentially hurtful or embarrassing to the target.
  • When explaining the reasons for sanctioning a user, be cautious with the language you use and avoid explicit descriptions of their actions.

In some cases, you could be legally responsible for inaccurate or unverifiable statements you make about someone in public. Therefore, it's important to limit your statements to the information that is available to you.

Indeed, actions taken may attract questions from the public, especially if it is a complex case. How can you deal with such questions?

Responding to third-party questions about a case

After you release the statement, remember that community members who see your announcement may not be familiar with the details of the case or the involved parties. They may have questions ranging from the general to the very specific, and it's important to answer them while also protecting the privacy of the case.

While it's understandable that community members may want more information, you are not obligated to fully answer every question if doing so would reveal case details that are best kept private. Use your best judgment to determine where to draw the line, and consult with your team or colleagues if you're unsure. Aim to provide as much detail as you safely can, but no more.

After a case: Debriefing

After a case is closed, it's important to debrief with your team to reflect on the process and identify areas for improvement. Consider the following questions:

  • Was the outcome the best possible given the circumstances?
  • What went well in this case? (Note down any successful strategies or tactics.)
  • What didn't go well in this case? (Identify any challenges or obstacles.)
  • How can you improve the process for next time? (Consider any policy changes or additional resources that could be helpful.)

A debrief can help establish better processes and keep an informal record of your team's progress. It's best to debrief soon after the case while the details are still fresh in your mind. The debrief doesn't need to be public – it can be done on an email thread, a private messaging platform, or in person.

For the U4C, debriefing after a case is a crucial step to evaluate the effectiveness of UCoC enforcement actions, assess the current state of UCoC enforcement, and recommend suitable changes to the UCoC and UCoC Enforcement Guidelines for the Wikimedia Foundation and the community to consider.

Appeals

The principle of appeals involves providing individuals with a mechanism to challenge or question decisions that affect them. Appeals are crucial for ensuring that decisions are fair and just, and they offer a layer of protection against errors or biases.

What does a standard appeals process look like?

A standard appeals process typically follows these steps:

Starting the Appeal

  • The individual seeking to appeal must formally request an appeal. This can involve reaching out directly to the functionary who implemented the action, another advanced rights holder, the relevant local enforcement structure, or, where no such structure exists, the U4C.

Submission of Appeal

  • The appellant submits their appeal, providing all necessary information and documentation. This may include details about the action taken, reasons for the appeal, and any supporting evidence.

Review and Acceptance

  • The enforcement structure reviews the appeal to determine if it meets the criteria for consideration. Factors considered may include the verifiability of the accusations, the length and effect of the sanction, potential abuse of power, and the likelihood of further violations. The acceptance of an appeal is not guaranteed. It must meet the established standards for review.

Information Gathering

  • The enforcement structure gathers relevant information and perspectives on the case. This process should be conducted with sensitivity to the privacy of those involved and the integrity of the decision-making process.

Assessment of Appeal

  • The appeal is assessed based on several factors, such as the severity and harm caused by the violation, prior histories of violations, the severity of the sanctions being appealed, and the length of time since the violation. Any suspicions of abuse of power or systemic issues are also taken into account.

Decision Making

  • After a thorough review, the enforcement structure makes a decision on the appeal. This decision is based on the analysis of all gathered information and the mitigating factors considered during the review process.

Notification

  • The appellant and relevant parties are notified of the decision. If the appeal is successful, the enforcement action may be modified or overturned. If unsuccessful, the original action stands.

When does the U4C handle appeals?

If no local enforcement structure exists or when local capacities are unable to address a situation, then an appeal can be made directly to the U4C.

Can Foundation office actions be appealed?

Appeals are not possible against certain decisions made by the Wikimedia Foundation Legal department. However, some Wikimedia Foundation office actions and decisions are reviewable by the Case Review Committee. This limitation on appeals from office actions and decisions may not apply in some jurisdictions where legal requirements differ.