Talk:Community Safety


Thank you for your input!

Use this discussion page to provide your feedback and ask questions about the Community Safety survey. Please remember to avoid sharing specific instances of harassment on this page, as the goal of this project is to measure and report on feelings of safety in specific communities in aggregate. If you need resources related to harassment, we encourage you to begin with your local governance structures or refer to the Trust and Safety portal. Thank you for your participation and help in making this survey project better! - TAndic (WMF) (talk) 14:55, 30 November 2021 (UTC)

Alexandr Snajdar (talk) 09:17, 31 March 2022 (UTC) I am missing the option "I feel uncomfortable because I am not sure I meet the Wikipedia standards ;)"

Thank you for your comment @Alexandr Snajdar! I believe your option would fall into the category of "Feeling unsafe or uncomfortable could also include other aspects of contributing that cause one to feel fearful or avoidant to continue contributing" (per the content page). If someone feels unsafe or uncomfortable enough to answer "Yes" to the question, no matter the reason, they should do so (and vice versa) :) We cannot tell people how they feel or whether their specific reason for feeling that way is valid or not; our aim here is to understand broadly whether or not users feel safe. - TAndic (WMF) (talk) 12:23, 31 March 2022 (UTC)

Surveyed twice

FYI, I was surveyed twice on the same account, at the same location -- if more information is needed to debug this, feel free to ping me at the foundation: User:Astinson (WMF), Sadads (talk) 20:25, 4 April 2022 (UTC)

Thanks for the feedback. –– STei (WMF) (talk) 17:59, 5 April 2022 (UTC)
In case others stumble onto the same issue, there are a few known reasons this can happen:
  • clearing your cookies
  • switching browsers or browser profiles
  • switching devices
Because the survey randomly samples browser session tokens rather than usernames, switching devices or browsers, or clearing cookies, can cause it to sample someone again. See the relevant Frequently Asked Questions section for more information.
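To make the mechanics concrete, here is a minimal sketch of token-based sampling (hypothetical Python for illustration only, not the actual QuickSurveys code; the sample rate and token handling are assumptions):

```python
import secrets

SAMPLE_RATE = 0.05  # hypothetical fraction of sessions that are shown the survey

def get_or_create_token(storage: dict) -> str:
    """Return the browser-local token, creating one if none exists.

    `storage` stands in for cookies/localStorage: clearing cookies or
    switching browsers/devices means an empty store, hence a new token.
    """
    if "survey_token" not in storage:
        storage["survey_token"] = secrets.token_hex(16)
    return storage["survey_token"]

def is_sampled(token: str, rate: float = SAMPLE_RATE) -> bool:
    """Map the token deterministically to [0, 1) and compare it to the rate,
    so the decision stays the same for as long as the token survives."""
    return int(token[:8], 16) / 0x100000000 < rate

browser_a = {}  # one browser profile
browser_b = {}  # a second browser, or the same one after clearing cookies
token_a = get_or_create_token(browser_a)
token_b = get_or_create_token(browser_b)
print(is_sampled(token_a) == is_sampled(get_or_create_token(browser_a)))  # True: same token, same decision
print(token_a != token_b)  # True: a new token gets an independent chance of being sampled
```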
Thanks once more for the feedback! - TAndic (WMF) (talk) 19:31, 5 April 2022 (UTC)

Abuse of access

There is an Eichmann behind the affected users, through abuse of access, on the Persian Wikipedia. I do not feel safe, and no one from the Foundation cares about me, because there is prejudice based on my block log. Ruwaym (talk) 14:03, 26 June 2022 (UTC)

Community Safety/Methodology

The higher the level of insecurity, the more people leave Wikipedia.

How do you include absentees in your statistics?

What do you do with harassed people who have not answered you, because they are no longer there?

Thank you for your answer. JMGuyon (talk) 17:13, 28 September 2022 (UTC)

Hello @JMGuyon and thank you for your thoughtful questions. I apologize for taking so long to answer, but I wanted to dig into this a bit because it's a very good methodological point.
You are correct that we have no way to contact users who have left a wiki entirely with this survey. As with any data, we must think of the limitations of what it can tell us. This one data point tells us what is happening at this moment, rather than what has happened in the past which led up to this moment.
However, I would speculate that we are likely capturing at least some sentiments of users who have continued to read a wiki but have stopped participating in editing. The reason I say this is that the survey randomly samples logged-in users while they are reading, rather than while they are editing or after they have completed an edit. This means that even users who have stopped editing are eligible to be randomly sampled, as long as they have ever made at least 5 edits on that wiki and are still visiting the site.
In part I base this speculation on the 2015 Harassment Survey, which used a similar method (CentralNotice to randomly sample users on-wiki) but also included a sample of inactive users contacted via email, in order to account for exactly what you're noting. The 2015 Harassment Survey had a question about how often the person edits a wiki, with the option to say they no longer contribute. Among those who were sampled on-wiki via CentralNotice, about 4% noted they no longer contribute. As the barrier to participation in the Community Safety survey is lower (the survey is only one question and does not require a user to leave the site to participate), I would assume (but of course cannot confirm) that the likelihood of someone who no longer edits but continues to read a wiki participating in this survey is slightly higher.
I hope my message helps to at least partially answer your questions, and thank you once more for asking. - TAndic (WMF) (talk) 11:49, 17 October 2022 (UTC)
I think when a person has been harassed to the point of leaving Wikipedia, they stop reading Wikipedia, because the place becomes traumatic.
The word "harassment" has a meaning: harassment leaves traces (psychological traces).
Is it possible to know the previous results of your survey on fr:wp, for example those of 2015, with the same method? --JMGuyon (talk) 00:36, 9 January 2023 (UTC)
Hi @JMGuyon – As the survey was conducted years before I joined the Foundation, it would take some exploration to see what is possible, and then time to do the analysis. Do you have a specific use for the 2015 data? If I have a good reason to do additional work, for example if it would assist a specific community in deciding which administrative projects to implement, it is much easier to ask my managers to consider the request in the context of other projects. Thank you and I look forward to hearing from you! - TAndic (WMF) (talk) 12:23, 23 January 2023 (UTC)

Codes of Conduct make me feel unsafe

The Community Safety page says that the survey of feelings of safety is part of the Code of Conduct development and enforcement process. There seems to be an assumption that creating, mandating, and enforcing a Code of Conduct would make people feel safer. That may be true for some people, but it is not true of everyone.

I want the participants in that process to understand that I, for one, feel LESS safe contributing to a project that has a Code of Conduct and/or an enforcement process.

I have seen these codes and processes used to target and harass people and to drive them out of projects. I myself have been driven out of a project over an allegation of harassment or discrimination that was patently false (it was based on the "feelings" of others who I never met but who had read my work, not on any actual intention or act of harassment by me). I was given no useful opportunity to contest it. Attempts to speak calmly and civilly in a safe place, seeking to come to a common understanding with those who had these feelings, were explicitly refused by enforcement intermediaries. It was all about propagating intolerance rather than fostering greater understanding. I was in the end voted out of the project by what amounted to a lynch mob. Reason and proportion had no sway, it was all about emotion, and once accused, I was considered guilty, at which point it didn't matter what the facts were.

My situation was not unique, and I have seen several other such lynch-mobs who attacked people I know personally, over "feelings of unsafety", and tried to drive them out (sometimes succeeding and sometimes failing). While CoCs claim to outlaw "harassment", they do not outlaw the use of a CoC process to harass someone; they empower and protect that new form of harassment. That is the abuse that I have seen over and over. (See the entire book, "en:The Coddling of the American Mind" for a larger overview of what's wrong with the movement toward banning anyone who triggers feelings of "unsafety" in others.)

The people attacked tended to be iconoclasts, those who do not follow the herd, whose imaginations are not limited by conventional thinking. Those who are different. It's an example of majorities trying to suppress minorities by giving them no right to exist. In other words, Codes of Conduct and their enforcement mechanisms tend to drive away the original thinkers and retain the more common, thoughtless, or mediocre people. As I try to be an original thinker, I perceive the whole CoC movement to be aimed at driving off people like me. It has long been observed that "all progress depends on the unreasonable man", who will think differently, who will not merely accede to conventional thoughts or actions. This movement toward mediocrity is likely to retard progress in the projects affected by it, rather than improving the projects.

Seeing Wikipedia go down this route paved with lynch-mobs and blaming their victims makes me feel less safe, rather than more safe. It makes me less inclined to participate in Wikipedia. I am concerned that merely posting this point of view makes me more likely to be harassed, as a dangerous "CoC denier".

I just thought that you should question the basic assumption that creating, imposing, and strengthening CoCs makes people feel safer. Gnuish (talk) 11:34, 7 October 2022 (UTC)

Comments/Feedback

Thank you for the survey; it is interesting but not detailed enough. I would nevertheless like to share my impression:

  • Having been harassed on Wikipedia.fr by various banned users of WPFR (Papa Franck, Noah Sokolowski and Apokrif), I find the survey not detailed enough (who? how? when?). These additions would, in my opinion, make this already very useful and necessary survey even better.

Felix felines (talk) 19:29, 8 January 2023 (UTC)

Hello @Felix felines, thank you for your message. I agree that more details in this data would be helpful to better understand what is happening and to guide which actions should be taken or prioritized by community administrators. At this moment it simply lets us know that a phenomenon is happening, and whether it is changing over time. The next step in the survey is to develop the QuickSurvey tool to ask a second question so that we can ask why people have felt unsafe or uncomfortable. However, this will require developer assistance and will take time. I will be working on proposals for how to answer questions such as yours in the coming months, and I hope I can ask you and other Fr.WP editors for some assistance when that time comes. For now, I am working on a series of Diff blogs that look more closely at this data and the Community Insights 2022 safety and harassment questions that I hope can be useful as well.
Thank you once more, and please let me know if you have any more questions or ideas you would like to share. - TAndic (WMF) (talk) 12:28, 23 January 2023 (UTC)

This survey's methodology is inherently flawed to a degree that renders it unfit for use

Hello,

As we are now no longer in the stated pilot phase of this survey, I need to stress its fundamental flaw more bluntly: simply knowing how many people have had issues tells Wikimedia nothing of value. Or, more precisely, under the current setup Community Safety has two choices: use the data to justify an action (or inaction), or don't use it at all. The latter obviously accomplishes nothing, and the former would be completely unsuitable on its own.

The fundamental additional questions needed are:

  1. Did you utilise any Wikimedian process designed to resolve that discomfort?
  2. Did they adequately resolve the source of the discomfort?

Without this, the data can be claimed by anyone to justify almost anything. Any use of it by T&S would fatally undermine whatever purpose it was being cited for, as a clear misuse of data.

Preferably, it would include comment boxes asking "why did you not utilise any process / which process(es) did you use?" and some others, but if brevity is desired, the above two are the fundamental ones. My apologies for being somewhat blunt, but discussions with WMF staff members last year do not appear to have resolved the issue, and it is critical that it be amended prior to any use, for any purpose. @STei (WMF) and TAndic (WMF): Nosebagbear (talk) 16:30, 10 January 2023 (UTC)

I saw "did you feel unsafe ...." and clicked "yes" just to see what happens. Nothing happened. I cannot even undo it. Mine is only one voice, but a wrong one :) --ThurnerRupert (talk) 09:07, 15 January 2023 (UTC)
Hello @Nosebagbear, thank you for your message. Please do not feel the need to apologize for your bluntness; I see honest critique as a gift in any research process (as long as it’s not mean-spirited and is genuinely meant to improve the research!).
As others have commented, I agree that more follow-up details behind the survey data would greatly help make this data useful for communities. There is one major issue in implementing this in the QuickSurvey tool itself: it can only ask one question at a time. The tool was originally developed for a different purpose, and while we continue to advocate for improvements to our on-wiki survey tools, it is still part of my role to use the best tools available to answer key questions in the meantime; unfortunately, the on-wiki tooling needs to be developed further before we can ask follow-up questions of users who felt unsafe. Once we have that capability, my first priority is to find out why users are feeling unsafe or uncomfortable (as it can range from individual incidents to a general sense of insecurity based on external global factors) so that communities can distinguish between the ways improvements could be made (technological, administrative, etc.).
Still, I do see some ways that may help to partly answer the questions you noted above, at least among what are defined as “active editors” in the Community Insights survey. The 2022 Insights survey asked several questions about safety and harassment, including two scales that may be especially useful for finding out whether users even know where to report an incident:
  • I know how to get help if someone is attacking, bullying, or harassing me on the Wikimedia projects.
  • I am confident that I can get the support I need if I am being attacked, bullied, or harassed.
While they are not quite what you asked, I think that, considering the wide variation in reporting mechanisms across wikis, establishing whether users know where to report discomfort or harassment is a necessary precursor to your questions. I am still working on the analysis and report for the Community Insights survey, as well as a series of Diff blogs on this safety & harassment data which will come out over the next few months. I hope to separate out the findings for the large wikis sampled in the survey so that we can get more detailed information. I can ping you as soon as these data are available, if you find them helpful.
Thanks once more for your feedback, and I hope you won’t hesitate to write again if you have further questions or comments! - TAndic (WMF) (talk) 12:35, 23 January 2023 (UTC)

The proportion of new users in each language version of Wikipedia

Is this proportion the same in all language versions?

The feeling of insecurity is stronger among new users: the newer you are, the more frequently you experience trolling.

It seems to me necessary to give the proportion of new users in the different language versions at the same time as the results of your survey.

The risk of being trolled is higher as long as the account is fragile, that is to say, before the edit count has reached approximately 5,000 contributions. JMGuyon (talk) 20:34, 12 January 2023 (UTC)

Hello again @JMGuyon! Though we do not know the proportion of new users, we do know the edit range of the respondents, which can function as a fairly good proxy for experience. When we see differences between groups, we apply weights to the data so that it reflects the 2021 population by edit-count bucket. For example, for Fr.WP, we apply weights to the dataset to make the population look like this: 5-99 edits: 52.24%, 100-999 edits: 23.78%, 1000+ edits: 23.98%. In general, what happens is that we slightly increase the proportion of the 5-99 and 100-999 edits editors by making their responses “count a little more”, and slightly decrease the proportion of 1000+ edits editors. It also makes sense why the respondent population does not look like the actual population: people who have made 1000 or more edits tend to spend more time on a wiki, and thus are more likely to “run into” the survey.
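For illustration, here is a minimal sketch of this kind of post-stratification weighting in Python, using the Fr.WP population shares quoted above; the respondent shares and the example responses are made up, and the actual analysis pipeline may differ:

```python
# Population shares by lifetime edit-count bucket (Fr.WP, 2021, from the reply above).
population = {"5-99": 0.5224, "100-999": 0.2378, "1000+": 0.2398}

# Hypothetical respondent shares: heavier editors are over-represented in the sample.
sample = {"5-99": 0.40, "100-999": 0.22, "1000+": 0.38}

# Post-stratification weight = population share / sample share.
weights = {bucket: population[bucket] / sample[bucket] for bucket in population}
print(weights)  # roughly {'5-99': 1.31, '100-999': 1.08, '1000+': 0.63}

# Each response then "counts" as its bucket's weight, so a weighted
# percentage of "felt unsafe" answers would be computed like this:
responses = [("5-99", True), ("1000+", False), ("100-999", True), ("1000+", False)]
weighted_yes = sum(weights[bucket] for bucket, unsafe in responses if unsafe)
weighted_total = sum(weights[bucket] for bucket, _ in responses)
print(round(100 * weighted_yes / weighted_total, 1), "% felt unsafe (weighted)")
```

In line with the description above, the 5-99 and 100-999 buckets receive weights above 1 (their responses count a little more), while the 1000+ bucket receives a weight below 1.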
I think I may have somewhat of an answer to your risk assumption, though: last week, I posted a Diff blog with the breakdowns of what percent of users in 2022 felt unsafe, by lifetime edit count on each wiki, based on this survey. What we’re seeing is that users with more edits are more likely to have felt unsafe or uncomfortable, and when we look at the Community Insights data, the same pattern appears not only for feeling unsafe or uncomfortable but also for having been harassed in the past year. This also makes sense as a first look, as making more edits means more interactions, and thus a higher chance of a negative interaction occurring. However, there is a counterpoint to this – it’s possible that users who are more experienced are also more likely to recognize when they feel unsafe or uncomfortable, or to recognize that a negative interaction should be understood as harassment. Likewise, it’s possible that users with more edits tend to take on more administrative roles, and thus have more chances for a conflict to occur. Finally, more experienced users may be better attuned to broader conversations about conflicts and events on a wiki that can make users feel unsafe. I don’t think this information negates the hypothesis that new users are more likely to experience trolling (and, as you noted previously, we do not know what happened to those who have left), but I think it does help us better understand the extent of the issues experienced users face in the course of their editing, while showing all of us that there is an opportunity to ensure less experienced editors do not face the same problems as they continue to contribute.
Thank you as always for your comments, and let me know if I misunderstood anything or if I can answer any other questions. - TAndic (WMF) (talk) 12:51, 23 January 2023 (UTC)

Oversampling

I have been prompted hundreds of times in two days. If folks answer some random number of those prompts, the results will be meaningless. —¿philoserf? (talk) 19:14, 14 January 2023 (UTC)

Hello @Philoserf – I apologize for the overwhelming number of survey requests you received. This quarter’s survey is now complete, and no one will be seeing the prompts again for a few months.
To troubleshoot this and make recommendations for fixes if they’re needed, can I ask a couple clarifying questions?
  • Do you mean hundreds literally?
  • Did you answer or click the X to opt out of the survey?
If you did not opt out or answer, the survey will continue to ask you on different pages until you exit out of it. That should account for the hundreds of prompts, if you meant hundreds literally, but it would mean a user is only sampled once.
Known reasons the survey can sample someone more than once are:
  • clearing your cookies
  • switching browsers or browser profiles
  • switching devices
Because the survey randomly samples browser session tokens rather than usernames, switching devices or browsers, or clearing cookies, can cause it to sample someone again.
If it’s none of the above, please let me know any details you can provide (feel welcome to email me at tandic-ctr wikimedia.org if you prefer to discuss privately) so I can try and resolve this!
As far as meaningful results: while there is some noise in the data collection process despite random sampling, we go through a cleaning process to delete duplicates from the dataset based on a few variables we get from the data. While no process can be perfect, the large sample sizes we’re working with and the consistency of the data across time indicate to me that it’s quite reliable.
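As a purely illustrative sketch (the real cleaning pipeline and the variables it uses are not described in this thread), deduplication of this kind often amounts to keeping one response per identifying key, for example:

```python
import pandas as pd

# Hypothetical response records; the real dataset and its variables may differ.
responses = pd.DataFrame([
    {"token": "a1", "wiki": "frwiki", "timestamp": "2023-01-10T09:00", "answer": "yes"},
    {"token": "a1", "wiki": "frwiki", "timestamp": "2023-01-10T09:05", "answer": "yes"},  # duplicate
    {"token": "b2", "wiki": "frwiki", "timestamp": "2023-01-11T14:30", "answer": "no"},
])

# Keep only the earliest response for each token-and-wiki combination.
deduplicated = (
    responses.sort_values("timestamp")
             .drop_duplicates(subset=["token", "wiki"], keep="first")
)
print(deduplicated)
```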
Thank you for raising this, and please let me know if I can answer any questions. - TAndic (WMF) (talk) 12:58, 23 January 2023 (UTC)
Yes, I make hundreds of edits.
I answered the survey the first time.
I clicked the X to dismiss the survey many times.
I ignored the survey many more times.
I edit mainly from Safari on iPadOS. —¿philoserf? (talk) 13:05, 23 January 2023 (UTC)
Thank you very much for the clarification. I'll look further into why this might be happening, and I apologize once more for the annoyance. -TAndic (WMF) (talk) 14:02, 23 January 2023 (UTC)