Community Resilience and Sustainability/Conversation Hour August 31 2023

You are invited to the quarterly Conversation Hour led by Maggie Dennis, Vice President of Community Resilience and Sustainability, on 31 August at 17:00 UTC.

Maggie and others from the Community Resilience and Sustainability team will discuss Trust and Safety, the Universal Code of Conduct, Committee Support, and Human Rights.

This conversation will happen on Zoom. If you are a Wikimedian in good standing (not Foundation or community banned), write to let us know you will be attending the conversation and share your questions at least one hour before the conversation. Please place “CR&S” in the subject line. Someone will follow up with the Zoom details.

If you do choose to participate in this conversation, Maggie would like to bring some expectations to your attention:

  • I can't and won't discuss specific Trust and Safety cases. Instead, I can discuss Trust and Safety protocols and practices and approaches as well as some of the mistakes we've made, some of the things I'm proud of, and some of the things we're hoping to do.
  • I will not respond to comments or questions that are disrespectful to me, to my colleagues, or to anyone in our communities. I can talk civilly about our work even if you disagree with me or I disagree with you. I won't compromise on this.

You may view the conversation on YouTube and submit questions live in Zoom and on YouTube.

The recording, notes, and answers to questions not answered live will be available after the conversation ends, usually in one week. Notes from previous conversations can be found on Meta-wiki.


  • Wikimania just wrapped up. I know there were staff there to support attendees with harassment or other UCoC violations. Is there really enough of a problem for that to be necessary? If it is, why don’t you staff all community conferences and gatherings?
There are enough problems with UCoC violations that it is important to have social spaces protected. At larger international events, support from Community Resilience and Sustainability (CR&S) is vital. However, we understand that the community also provides support at these gatherings, and paid Trust and Safety staff may not be necessary at every event. If there is more need for CR&S support at an event, get in touch.
  • The Movement Charter is nearing completion. There are people on my wiki who say it doesn’t have anything to do with us and who think it’s something the affiliates and the Foundation are trying to force through. It’s the editors who make things happen, and this isn’t for or by us. What would you say to those people?
It is very hard to mobilize a community of thousands of people to all work on the same thing at the same time. It makes sense to me that we’re not all on the same page at the same time. The Movement Charter is important to everyone. We have had struggles between groups, and it is important that the Charter becomes something that is useful to everyone. We spend a lot of time debating processes and protocols when we could be doing work. The Charter needs to work for us all and to represent us all.
Risker/Anne (MCDC member): It is reasonable for editors and contributors to note that the Charter so far has had little impact on, or relevance to, the projects or the editor communities. The MCDC wants to hear from editors and contributors, in whatever feedback channel they are comfortable with, with suggestions on how to make the Charter more relevant and useful for your community, because editors/contributors are one of the four groups (projects are another) that must ratify the Charter for it to be implemented.
  • There was a law passed recently in Russia about working with non-profit organizations. What are the implications for volunteers living in Russia or individuals who volunteer and have some connection with Russia (citizens, etc.)?
President Putin signed a law that prohibits citizens from taking part in organizations that are unregistered in Russia. Individuals presumed to be working with unregistered NGOs can face prison sentences of up to three years. We do not want to increase our risk by suggesting this applies to Wikimedians. We advise all community members, including those inside Russia, to be vigilant and not to put themselves in harm's way. We also advise people not to use names that can identify them; we do not want to put our community members at risk. A link with guidance on how to keep yourselves safe was shared:
  • Could Human Rights folks on the call talk about how we could discourage "real name use" when creating accounts?
Resources around doxing and username safety were shared in the chat:
Doxing: why should you care?
Doxing: have you tried doxing yourself?
How can a username keep you safe?
There seems to be some longer-term tension within the community between editors who think we should use real names for transparency and those who encourage pseudonyms for privacy. Using a real name is “great as long as it is great”. But real harms and harassment do occur, and laws on online engagement are changing around the world in ways that can put you at risk when you use your real name. Our advice is to be extremely mindful about the use of real names. As a global community, even if you are in a safe area, you may be affected by cross-border issues and initiatives.
  • I know you won’t talk about specific trust and safety cases or human rights cases, and I understand why in most cases, but that can make it hard to talk about the issues behind them. It’s broadly known that some Wikipedians are in prison in one jurisdiction. Why won’t you talk about that case? Secondarily to this: what kinds of things does the Foundation do when people go to prison for editing?
If there is a jurisdiction where an individual can be imprisoned for editing Wikipedia, the risk extends to the wider community living in that region; hence the need for the Wikimedia Foundation to be careful when operating in that area. We mostly work with organizations that have experience in this space and let them take the lead in such situations. We also continue to sensitize community members to the importance of keeping their real names private: try not to use your real name when editing Wikipedia. As a global community, we need to remember that by using our real names we may put people in sensitive areas at risk. We need to be mindful of that.
  • There was a lot of press coverage in Singapore around the gender neutral bathroom. For one example, Singapore debates gender neutral toilets. Singapore is a pretty modern city, with laws protecting gay and transgender people, but some are still calling for a boycott of Suntec because they allowed this. It’s only been a few years since two transgender tourists were arrested in the UAE for impersonating women. How can we ensure safety and inclusion for LGBT+ Wikimedians across the globe when even the right to exist can lead to arrests and protests?
We can’t ensure it. We can try for it. We can be mindful of where we go, how we act, how we treat each other, and how we stand up for it. The LGBT+ UG has spoken with us about the range of issues users face across the group, from debates about pronouns to threats to life. We operate within the world we live in, so we have no magic answers. But we will try to ensure safety and inclusion for LGBTQ+ Wikimedians by being present and by speaking out against injustice. We will listen, we will do our best to ensure safety, because this matters to everyone, and we will encourage all contributors to think about and prioritize it.
  • You’ve committed to reviewing the UCoC policy a year after the enforcement guidelines are approved because you wanted to see them in action. What’s the point of that when the U4C won’t even be operational?
“Ignore all rules.” Not that we should ignore the UCoC, but if it’s not the right time because the U4C isn’t active, we are open to doing things in a different way. If we need to review the policy once the U4C is operational, we can do that.
  • My local wiki doesn’t have an arbcom. How do I report a UCoC violation? What happens after I make the report?
Most local wikis are large enough to have local processes, and local processes are a good place to start. We have many community members who are experienced at handling these things; you don’t always need the court of final appeal. While I don’t want to encourage everyone to reach out to the Trust and Safety team, if you do reach out, they can tell you whether it is something they can help with and, if not, they may be able to point you to the right place to get help.
Jan: Community is the place to start. The UCoC and the EGs say so. Once the community has endorsed the U4C, there will be an additional pathway.
  • Are there policies within Trust and Safety that require the Foundation to proactively update information of individuals that are globally-banned/banned by Trust and Safety? There was an incident in (this year’s) Wikimania where the on-site volunteer Trust and Safety team could not identify an individual that was banned by the Foundation Trust and Safety because of an information gap.
Onsite volunteer Trust and Safety members are meant to be given a list of banned members. We do not always have pictures of individuals.
Jan: As a matter of general practice, we update the list of the Foundation banned members, and we share with the respective teams planning events. The list may sometimes not be complete or up to date in terms of images, for example.
Maggie: If there are gaps we are interested in hearing about them so that we can see what to do.
  • Is it ok to share the details of a complaint filed to the Affcom or trust and safety with other community members?
It depends. The UCoC talks about ways that valid concerns can be used to harass other people. It is justifiable to talk about concerns you are having.
Once you have begun putting a concern into the proper processes, what is your reason for continuing to talk about it? What is your goal? I believe the UCoC speaks to this in terms of whether a reasonable bystander would see the intent as driving people away. What will be considered is why you are speaking about it, where you are speaking about it, to whom, and how. There is no policy that gives a yes-or-no answer to this question, but we need to be mindful when we share this information. We should not use these cases as a weapon.
  • How are incoming regulations such as the Digital Services Act (DSA) going to impact the community content moderation/policy enforcement processes? The DSA (Digital Services Act) just went into effect on Friday. What is the Wikimedia Foundation doing to comply with this new act?
All laws and potential regulations have an impact on our processes. It is important to have proactive compliance. Wikipedia has been good at community policing: it is good at cleaning up violations to ensure that we are not on the wrong side of the law. Our lawyers like to say that the community is our best defense. Our community’s proactiveness not only helps us comply with laws, but can also help protect us from stronger laws being applied to us. There is a tension between the desire to keep the internet safe and community autonomy.
Jan: The EU was good about making room for community self-governance. The DSA gives room and recognition for communities to govern themselves. We have concerns about the OSB (Online Safety Bill) because it does not make that distinction about community self-governance. Incident reporting is an aspect the DSA is interested in, and the DSA may be useful for the Wikimedia Foundation. It will be a long journey to get compliance right.
  • The UCoC includes language fluency and age. Is it possible under the UCoC for language based projects to have expectations of language fluency, and if so how do we differentiate between expectation of behaviour and expectations of proficiency?
In the past, the English Wikipedia community has been more tolerant of past problematic behaviour when people have said "that was years ago when I was an adolescent, now I'm almost a legal adult." At the same time, there are usually a number of opposes on maturity grounds when a candidate for adminship admits to not yet being a legal adult; some opponents have been as specific as "come back when you are 18". We also use deletion and oversight when children on the projects disclose dates of birth or other personal data. Is all of this compatible with the UCoC, or should the UCoC be amended to allow for children to be treated differently from adults?
Dealing with content that is not up to standard is not improper and does not constitute discrimination; how we do it may be the problem and may introduce discrimination. The U4C may be the body that makes these calls, including on the age issue. Not everybody is suited to every role; for example, there are roles that people under the age of 18 cannot hold.
WereSpiel: We need something in the UCoC that actually says that dealing with content that is improper does not constitute discrimination.
  • I have a couple general questions about a recent event: When considering a global ban, to what extent and how does the Foundation consider (1) whether the subject is likely under local political pressure influencing their behavior, and (2) whether such a ban could harm movement interests?
Wikimedia Foundation bans are based on behavior. We do think about global issues such as political pressure. There are people who may be pressured to do something they do not want to do. What we look at is whether they are working against the local policies and are undermining the ability of the community to self govern. We look at patterns of behavior and how the activities affect us. We also consider movement interests before the bans are implemented.