Strategy/Wikimedia movement/2018-20/Transition/Discuss/Provide for Safety and Inclusion

This is an open discussion space dedicated to defining potential priorities for implementation from the Provide for Safety and Inclusion recommendation.

Code of Conduct

  •   Support Ad Huikeshoven (talk) 20:25, 22 November 2020 (UTC); actually, I prefer the Contributor Covenant version 2 to this concoction. What should already have happened is ratification by the board. The discussion should have started with how we are all going to enforce any guideline in this respect, before proposing any specific guideline. A code of conduct is just paper without enforcement.
  •   Oppose Most of this year's enwiki arbitration candidates have expressed concerns about the drafts, especially the one submitted to the Board. Not just the candidates; many other editors also have concerns and/or are against implementation of the Code during the feedback phase. We should not make this a high or top priority while the communities remain divided about this "initiative" and while the potential impact of the Code is tremendous and irreparable. Furthermore, I'm unsure whether the Code is effective in bringing and holding communities together as promised. George Ho (talk) 01:14, 23 November 2020 (UTC)
  •   Oppose this edition, tailored by the WMF to cover up the disturbed personalities they use to enforce their strategies on smaller wikis. User:PEarley (WMF) has deleted and omitted Workplace bullying / mobbing / moral harassment as well as Culture of fear and Toxic workplace from the code of conduct.   ManosHacker talk 14:00, 24 November 2020 (UTC)
  •   Oppose. Too controversial, wait until the WMF has rebuilt the trust of the community. I don't oppose a code of conduct, I oppose a code of conduct being dictated by the WMF and arbitrary, heavy-handed, centralised enforcement. MER-C (talk) 19:37, 24 November 2020 (UTC)
  •   Oppose. Benjamin (talk) 06:46, 28 November 2020 (UTC)
  •   Oppose. The WMF did not ask if this was something that was desired. While the WMF has insisted that it is intended only for smaller wikis which have not had time to develop conduct policies of their own, the document itself does not say that. Nor is there an opt-out mechanism for communities which already have their own established policies and processes and would prefer those. Those deficiencies are fatal, and to be blunt, given the level of trustworthiness the WMF has shown, this seems like a backdoor to pulling another Fram incident and pointing to this. Seraphimblade (talk)
  •   Oppose The Foundation is the poster-child of "the road to hell is paved with good intentions". They actively sabotaged the community's efforts to clean up pervasive abuse on Azwiki, the Foundation royally screwed up in the Framban case, doubled-down on that screwup, and is trying to ram this dysfunctional CodeOfConduct initiative down our throats. Alsee (talk) 19:36, 2 December 2020 (UTC)
  •   Oppose. Conduct policy on Wikimedia wikis is set by the communities, not by the WMF. The WMF is not authorized to establish such a policy. (Additionally, the organization clearly does not know how, in practice, to put together an appropriate policy.) --Yair rand (talk) 07:20, 14 December 2020 (UTC)
  • Thank you to all who commented for the thorough feedback. Although the Code of Conduct has been ranked as a high priority for globally-coordinated implementation across the Wikimedia movement, it was not one of the initiative clusters discussed at the 5-6 December Global Conversations. There is an existing space where such discussions can probably take place and receive wider and more efficient engagement. If interested, you can also check out the top initiative clusters that will be discussed further for implementation in 2021-22, as a follow-up to the Global Conversations and to the discussions here --Abbad (WMF) (talk) 15:58, 17 December 2020 (UTC).

Private incident reporting

Create pathways for users to privately report incidents, either technical or human, including harassment, to have them addressed effectively and with appropriate urgency regardless of language or location.

  •   Neutral, partly   Oppose. Of course, such pathways would help in certain situations. In other situations, though, they could lead to the result that too many communicative problems in the Wikimedia communities are no longer addressed openly in free discussion, but are "solved" by "punishing" the conflict party that first called the WMF for help. Similar situations have already happened and I don't want to see them happen more often. A way to privately report "human incidents, including harassment" should never be a standard way of dealing with conflicts between Wikimedians. It should only be used (1.) in situations that really mean an acute personal danger for one of the parties – as is already possible via emergency wikimedia.org – or (2.) (maybe) in very small wikis. I use the plural "communities" because one can distinguish a global community, often called the "movement", and many communities in the single Wikipedia language versions/Wikidata etc. See also [1]. --DerMaxdorfer (talk) 16:49, 2 December 2020 (UTC)
  •   Oppose private reporting, until the Foundation stops violating its own commitment and the global community consensus that it cease abusively meddling in routine content disputes. Alsee (talk) 20:00, 2 December 2020 (UTC)
  • Thank you both for your useful feedback. Based on results from here and the Global Conversations, there's now a dedicated part of the space to discuss the highly-prioritized initiatives for global implementation in 2021-22 (which do not include this initiative). You're invited to continue the conversation there, if interested --Abbad (WMF) (talk) 16:05, 17 December 2020 (UTC).

Baseline of community responsibilities

Establish a baseline of community responsibilities for safeguarding and maintaining a healthy working atmosphere for both online and offline Wikimedia involvement, along with procedures for reporting and follow-up.

  • ..

Develop a safety assessment and execution plan

Research and develop an adaptable safety assessment and execution plan, as well as rapid response and support infrastructure. This would ensure that on- and offline contributors have available resources ready and accessible to mitigate the harm caused by their Wikimedia activities, including:

  • psychological support (e.g., therapists, counselors, mediators),
  • technical support (e.g., anonymization mechanisms),
  • legal assistance (e.g., list of partner lawyers, facilitation of legal representation at a local level),
  • a fast-track escalation path in life-threatening situations,
  • procedures for reacting to large scale challenges, such as socio-political unrest and environmental threats,
  • training and opportunities to raise awareness and build response capacity for community safety and conflict management.
  • ..

Advocacy - local capacity development

Develop local capacity for advocacy on the legal and regulatory frameworks our communities need to make our projects thrive.

  • ..

Built-in platform mechanisms for safety

Establish built-in platform mechanisms aimed at the safety of contributors in their contexts (e.g., anonymization mechanisms). These would be assessed for opportunities and risks in particular projects or contexts, including the risk of harassment and vandalism.

  •   Oppose the masking project. The Foundation can terminate IP-editing if it wishes, and the community can prohibit masked edits. Alsee (talk) 20:12, 2 December 2020 (UTC)
  •   Oppose, as long as no outsourced psychological evaluation is established for bold admins / people chosen by WMF, to support / lead their strategies in small communities.   ManosHacker talk 03:02, 3 December 2020 (UTC)
  • Thank you both for your useful feedback. Based on results from here and the Global Conversations, there's now a dedicated part of the space to discuss the highly-prioritized initiatives for global implementation in 2021-22 (which do not include this initiative). You're invited to continue the conversation there, if interested --Abbad (WMF) (talk) 16:04, 17 December 2020 (UTC).