Grants talk:Project/JackieKoerner/Addressing Implicit Bias on Wikipedia


Scope

I have a hard time understanding what the concrete proposal here is. A need and a potential method are identified, but nothing at all is said about how this specific project would add something new or positive.

  • Five references for a topic on which there is overwhelming literature sounds like very little. I'm sure you've read more than that; otherwise you wouldn't be proposing such a large piece of work. A simple keyword search finds almost a hundred works with Wikipedia in their title which mention grounded theory. I'm sure you've read them all, so it would be interesting to know what they're lacking. Don't be shy about commenting on the existing literature!
  • Project grants are a very poor organisational setting to conduct research. It's generally very unlikely for such research to be useful, because lone researchers have a much harder job than researchers with some kind of research group to rely on. The failure rate for such projects is sobering. Have you tried proposing this project to an existing research group, at a university or elsewhere, which could host you?
  • If the grantee already has sufficient experience to be their own PI, this doesn't emerge from the text. It would be worth specifying previous research the author intends to build on or which is otherwise indicative of what's going to happen here.
  • It's not at all clear what communities the project intends to address. The title says "Wikipedia", the method mentions Meta for broader transparency, but then there's talk about English Wikipedia policies. How many languages does the author know? What language communities do they intend to observe?
  • The proposed travel costs are unreasonably high.
  • It's a red flag to see several mentions of proprietary software. Its relevance, and the reasons why it can't be substituted with open alternatives, should be made explicit and need to be very compelling, especially as proprietary sources and data are going to make the outputs less likely to be open science. Also, what's the point of following some minor unofficial communication channels while ignoring the main ones, especially IRC and mailing lists?

Nemo 12:32, 23 February 2020 (UTC)

Hey Nemo, thanks for your comments. Let me address them below:
* Yes! I have read loads more than this. I only included pieces I cite in my proposal. I have a whole folder on my cloud drive with marked-up documents about bias. Certainly, there are many published studies about Wikipedia, and some have used grounded theory. Grounded theory is the methodology, not the subject matter I am studying, so no, I haven't read all the studies about Wikipedia that use grounded theory, as many of them wouldn't be relevant to this work.
* Thanks for your anecdotal feelings about research and project grants. If there are figures on this, please feel free to share. Hopefully, the Project Grants committee will encourage more research proposals in the future! This has been expressed to me multiple times over the past few years - that research is needed and people are glad to see more project grant proposals about research. I am sorry you do not feel as hopeful about research project grants as I do.
* If there are any concerns about my credentials, my full CV is on my personal website which is accessible via my user page.
* For the language and community pieces, please reread the proposal. If you find this to still be unclear, let me know.
* I appreciate the travel costs seem high to you. Have you calculated what flights and hotels may cost when traveling from the United States to other locations around the world?
* This software is what I am comfortable using. I am going to release the content published in the report and analysis wherever doing so would not violate the privacy of participants. As for IRC and mailing lists, I find those to be very homogeneously populated forms of communication where not everyone is participating.

Best, Jackiekoerner (talk) 19:20, 12 March 2020 (UTC)

Thank you for dismissing my concerns as "anecdotal feelings". From my point of view, you are the person requesting the grant, so the onus is on you to show that the structure and method you have chosen are the best ones, or at least suitable. If you need help doing that, feel free to ask and I may manage to help, but you have no right to demand that I share my 10+ years' experience in Wikimedia grants with you in the format you prefer. Nemo 21:15, 12 March 2020 (UTC)

Eligibility confirmed, Round 1 2020


This Project Grants proposal is under review!

We've confirmed your proposal is eligible for Round 1 2020 review. Please feel free to ask questions and make changes to this proposal as discussions continue during the community comments period, through March 16, 2020.

The Project Grant committee's formal review for round 1 2020 will occur March 17 - April 8, 2020. We ask that you refrain from making changes to your proposal during the committee review period, so we can be sure that all committee members are seeing the same version of the proposal.

Grantees will be announced Friday, May 15, 2020.

Any changes to the review calendar will be posted on the Round 1 2020 schedule.


Questions? Contact us at projectgrants(at)wikimedia.org.


I JethroBT (WMF) (talk) 16:30, 27 February 2020 (UTC)

Strategic, but possibly too wide?

In general, JackieKoerner does a really good job gathering opinions and facilitating participation of the Wikimedia community. This has been evidenced in her work in both the North American and international communities.

However, I am wondering if this might be too ambitious a project for one person as it is currently scoped. For example, we already have a substantial body of research, and a research agenda, coming from WMF research on knowledge gaps. @LZia (WMF): and team are exploring that really widely. And there are several studies from WhoseKnowledge and as part of the WMF-funded Gender_equity_report_2018, among others. I think there is also a wide body of anecdotal reports/reflections, and/or secondary findings in other research, that point at these problems very precisely: e.g. the Movement Organizers report captured the notability bias problem on English Wikipedia from Ghanaian organizers -- that feeling that the international community does not know how to assess local notability. The on-wiki communities also have a history of not being receptive to broad expert findings in these areas -- building consensus requires long-term engagement on these topics.

I am wondering if the project could be scoped to one subset of the problems, i.e. just looking at the "reliable source"/"notability" biases and identifying effective tactics for building change in the EnWiki community -- Women in Red, WhoseKnowledge, and the Ghanaian and Nigerian editor communities have begun building explicit tactics to address these problems, but are still confronted by system-level issues. Also, I think a narrower, more practice-focused study could have broad implications for content campaigns and other organizing efforts that try to address these biases.

Broadly speaking, I am interested in where this project might go, but it currently feels overscoped. Astinson (WMF) (talk) 16:25, 3 March 2020 (UTC)

Hey Astinson (WMF)! Thanks for stopping by to review the proposal. This does seem like a lot. I actually reduced this from the proposal I submitted in September 2017. I met with the Community Data Science folks during their retreat in September 2019; there was a project feedback session and I submitted this for feedback. The feedback I received was to scale it back to this. The educational pieces and other curiosities I have about bias will need to be separate studies in the future.
I know this seems big, but I think it's important to start open in order to truly find out the direction and needs of the community. For example, in one of my past studies about the experience of students with disabilities on college campuses, I thought about studying just students with autism, because I thought studying all students with all disabilities would be too broad. It was actually great! I learned some really unique things I did not anticipate learning and used that to change policies at several Midwestern universities.
I understand there is a lot out there on gaps. Some of these studies were very targeted or specific in their search. I want to be open to see what I see. Right now I see policy changes as outcomes of this study, but perhaps it could be something more aligned with cultural shifts, or educational needs. I won't know until I get there. That's the beauty of having such an open mindset on the topic - truly being able to take in what is happening and identify where we need to go next. Best, Jackiekoerner (talk) 19:34, 12 March 2020 (UTC)

Open approach

Hi Jackie,

Thanks for caring so much about this topic. I appreciate your efforts in this field and would love to see some clear outcomes on what gaps there are beyond the obvious. I doubt anyone will contest that bias exists, and that it would be great to comprehensively understand it.

I'm a little unclear on what exactly you're trying to accomplish, and on what this adds to what other people are already doing or have done. I have a few questions that would hopefully help clarify this for me (it may well be that this is already clear to others, in which case I apologize for asking the obvious):

  • Which project(s) are you focusing on, if any? Am I correct in interpreting that you will only focus on English Wikipedia? How generalizable do you expect your findings to be? How about the methods, do you think you can capture the same information using the same methods in other language communities?
  • How will you identify an implicit bias? For example, will you use some data-driven metric compared against another source, or will you rely on expert opinion or crowd opinions?
  • How will you determine the policy recommendations? For example, do you plan to design and run experiments, or will you base the recommendations on your reading of the policies, or interviews with experts, or literature, or otherwise?
  • Could you give some directions of recommendations that may come out of your research that would help reduce bias? I'm trying to understand what level of granularity you're shooting for, and what category of recommendations may go beyond what is already out there.
  • How do you plan to determine the impact of bias on "participation, policy, and infrastructure"?

Thanks! Effeietsanders (talk) 01:40, 13 March 2020 (UTC)

Hi Effeietsanders! Thanks for taking the time to respond to the grant proposal. You asked great questions! Let me see if I can answer them and use that to clarify my language in the proposal.
* For this first study I will focus on English Wikipedia. This study will serve as a template for others who may want to replicate it. I’d love to consult with people from other communities who would like to engage with bias in their communities. I anticipate the results will be generalizable, but perhaps not: all cultures differ in some aspects, and the way bias affects those communities may be very different. I think some biases could be present in all communities, but how they play out might be unique to each.
* To identify bias, I will be using interviews and content analysis. I have not identified the data bank of biases yet. I am anticipating having to test a few in order to see which will garner the most success.
* The policy recommendations will be based on the study findings. I’ll run them by interview participants, adjust as needed, then publish for the community.
* Your last two questions I think can be answered as one. Grounded theory is an open methodology which enables the social situation to be freely studied as it exists. With this study I’m looking to see what is happening. From this information I can determine the impact on participation, policy and infrastructure and what changes should be recommended.

Best, Jackiekoerner (talk) 04:13, 14 March 2020 (UTC)

Neutral approach

I think the premise is good, but something to think about is thinking both within and outside the box when collecting data. For example, can Wikipedia overcome this bias within the next ten years, or will it take longer? I edit mostly in the Nigerian space, and what I notice is that a lot of Africans gravitate towards editing the English and the French Wikipedias. But we may not be considered the best writers in English or French, so there is going to be at least a minimal implicit bias against ESL editors or people who grew up with Nigerian/pidgin English. A solution could be to create a Nigerian English Wikipedia to become the main space for Nigeria-related articles; it would be mainly English with some bad grammar. But I do not think that would be a good solution, because you would be excluding the Western world, whose editors could contribute more than half of the Nigerian articles, along with its bias. From my experience, positive editors on African-related articles are people who care about African issues or people who have a positive bias towards African issues. I believe in encouraging local projects and finding ways to expand those projects, so that editors who do not really care about African-related issues but generally edit on Wikipedia can discuss any significant edit on the talk pages of those projects. Also, moving towards a situation where people who want to make significant edits to an article are encouraged to go online and read about the subject matter before making such edits. Encouraging more people to have a positive bias towards African issues, by reading or engaging with members of the projects, could be a way to counter an existing bias. I believe self-interest and that implicit bias probably contributed to the development of the English Wikipedia; to have robust articles on other subjects, you may need to develop a positive bias towards that subject, which may not necessarily be neutral, in order to counter any prevailing negative bias.

Hopefully I made some sense. 96.244.13.168 21:17, 13 March 2020 (UTC)

Hi Friend! I really enjoyed reading your comment. Would you be willing to be one of the interview participants if the study is funded? Best, Jackiekoerner (talk) 04:19, 14 March 2020 (UTC)
Sure. 96.244.13.168 01:12, 17 March 2020 (UTC)

Aggregated feedback from the committee for Addressing Implicit Bias on Wikipedia

Scoring rubric and scores:

(A) Impact potential: 5.0
  • Does it have the potential to increase gender diversity in Wikimedia projects, either in terms of content, contributors, or both?
  • Does it have the potential for online impact?
  • Can it be sustained, scaled, or adapted elsewhere after the grant ends?

(B) Community engagement: 4.3
  • Does it have a specific target community and plan to engage it often?
  • Does it have community support?

(C) Ability to execute: 6.5
  • Can the scope be accomplished in the proposed timeframe?
  • Is the budget realistic/efficient?
  • Do the participants have the necessary skills/experience?

(D) Measures of success: 5.0
  • Are there both quantitative and qualitative measures of success?
  • Are they realistic?
  • Can they be measured?
Additional comments from the Committee:
  • This project fits with the strategic direction of knowledge equity. The big questions for me are whether the findings will be seen as legit and credible by the community, as well as whether any resulting recommendations can/will be implemented. There is already a good amount of interest and community organizing around this and related issues, which is encouraging in terms of evaluating potential uptake and therefore impact.
  • The project fits the strategic priorities, but I don't think it can produce a long-term impact because it depends a lot on the activity of the proposer. The system of monitoring is quite simple and is based on a single person.
  • The project fits Wikimedia's strategic priorities but how its results can be sustained or scaled or adapted elsewhere is difficult to estimate because those results are unclear.
  • This is an important issue, but my concern is that the scope and undertaking are inappropriate for one person to take on. A key component of the potential data set (talk pages and discussion pages) is vaguely defined, and, by leaving the selection of pages in one person’s hands, also seems an approach that is subject to individual bias. I see that the grantee welcomes community participation, but my suggestion would be that the grantee prioritize interviews with community members who are active in this area, instead of the review of Wikipedia pages. Focusing on a more specific or narrow area of study (whether specific policies or type of bias) could also help with scoping.
  • Project would ideally be more innovative and interactive.
  • The project is iterative in nature. Its potential impacts are unclear and will probably be quite limited. The main risk is that the planned review of implicit biases on English Wikipedia uncovers nothing new or of interest, so the final results could look rather banal. The measures of success are practically non-existent.
  • It’s obvious the grantee has relevant Wikimedia experience; she also seems to have the necessary research background/skills. The requested salary seems high relative to what WMF has paid other contractors.
  • Yes, the project can surely be realized, but it cannot easily be replicated.
  • The scope is actually too broad and, as stated, cannot be accomplished in 12 months - Wikipedia is too big. The budget is realistic and the grantee probably has the necessary skills.
  • Good number of endorsements. I personally would like to see community participation as more central to the work being proposed, e.g. the grantee being supported by a reference group or advisory team would further combat potential issues of bias in the research.
  • There has been none so far but it is planned.
  • The thing about implicit biases is that everyone has them...including the grantee. I'm not super familiar with grounded theory and how the methodology addresses researcher bias, but would like to see more engagement around this issue through certain changes/new measures, such as the inclusion of an advisory team or by adopting an approach that puts more of a focus on interviews (vs. review of pages). I would also like the grantee to consider ways of reducing the budget - this is a fairly costly request relative to other proposals, with uncertain impacts.
  • The project is linked a lot to "solo performance". Basically it's a community management project, but I suppose that it could be more efficient if linked to a department rather than to a single person.
  • I think that this project needs more focus, more realistic goals, and more explicit measures of success.
  • The project lacks focus. The grantee wrote: "I will observe and review talk pages, discussion pages, and other locations where conversations take place between contributors, whether on- or off-wiki. I will begin my focus on topics where bias has an overt impact. This would be, for example, gendered topics. Then I will examine conversation on popular pages, the Village Pump, Articles for Deletion, Requests for Comment, policy and procedure pages, as well as others as deemed appropriate for the investigation." I don't think this is possible. Several discussions happen on noticeboards and talk pages at the same time. I am unsure how this grantee would capture all these discussions and other related threads.
  • There is no evidence that communities would follow whatever recommendations result from this proposal, and even if this happens, I am not sure how the recommendations would be useful.
  • Aside from the handful of endorsements on the proposal page, there is a need for a broad discussion around this proposal on major noticeboards on enwiki. This is necessary to gauge the level of support and the chances that recommendations would be accepted by that community. Considering all these issues, I am neutral for now and lean towards not recommending this proposal for funding.

Opportunity to respond to committee comments in the next week

The Project Grants Committee has conducted a preliminary assessment of your proposal. Based on their initial review, a majority of committee reviewers have not recommended your proposal for funding. You can read more about their reasons for this decision in their comments above. Before the committee finalizes this decision, they would like to provide you with an opportunity to respond to their comments.

Next steps:

  1. Aggregated comments from the committee are posted above. Note that these comments may vary, or even contradict each other, since they reflect the conclusions of multiple individual committee members who independently reviewed this proposal. We recommend that you review all the feedback carefully and post any responses, clarifications, or questions on this talk page by 5pm UTC on Tuesday, May 11, 2021. If you make any revisions to your proposal based on committee feedback, we recommend that you also summarize the changes on your talk page.
  2. The committee will review any additional feedback you post on your talk page before making a final funding decision. A decision will be announced Thursday, May 27, 2021.


Questions? Contact us at projectgrants(at)wikimedia.org.


--Marti (WMF) (talk) 01:38, 10 May 2020 (UTC)

Response to aggregated feedback from the committee for Addressing Implicit Bias on Wikipedia

I appreciate everyone who took the time to read, review, and provide feedback on this proposal. Your dedication to this project has not gone unnoticed! I have made updates to the project grant proposal to add clarity and provide information where there were gaps. Full disclosure: I submitted the initial project grant draft when I had “a bad virus,” which I now suspect was Covid. I was not fully coherent or in my regular state for about a month. I hope my edits provide sufficient additional information.

Thank you for recognizing this project as an initiative formed in the vein of the strategic direction. I am very fortunate to have worked on the Wikimedia 2030 Strategic Direction as part of the Community Health Working Group. Issues surrounding inclusion and equity are critical to the success of the movement. Bias is something I have been passionate about for many years now.

In the feedback, one reviewer expressed concern that the study wouldn’t be considered legitimate or credible in the community. I also wonder about this. But the same scrutiny about credibility has occurred with the gender gap and harassment; those exist whether or not people choose to find them legitimate or credible. In our movement we have toxic power structures that function to keep a homogeneous group in power. We have to remember that a lot of the folks in our community who choose to find information about equity not credible or legitimate are part of that toxic power structure. Operating under confirmation bias - choosing to seek information that fits a personal narrative - can help anyone discredit information if they really don’t want to believe it. If they don’t want to believe bias exists, I may or may not be able to convince them with this project. I am not doing this project for them, but rather for the people in our movement doing this strong equity work each day.

I am encouraged by the number of people supporting this proposal as well as the 2017 iteration. These are people in the community who have been doing equity work or working on gender-related issues and efforts. Often research provides insights into phenomena happening in our world in order to educate and provide deeper understanding. This proposal aims to do that - give quality, research-based information to people who are doing equity work, provide insight to people who want to learn more, and give a roadmap for people looking to replicate or iterate on this study.

I don’t quite understand what is meant by this being “a community management project,” but I can speak to the concern that this is unreasonable for one person. I can see how this is a big project. I am dedicating 40+ hours per week to this project for a period of 12 months. I would love to have a department or group working on this, but for now it’s just me. I am tenacious by nature and an advocate at heart. I have no doubt in my capability to complete this project. It’s a big project I am passionate about, and it has to be done. I also find it interesting that the same concerns weren’t expressed about other research proposals dedicating fewer person-hours to their projects.

Some expressed that my study should focus on interviews. Thanks for being curious about my methodology. I am using interviews, which are amazing sources of information, but I do not want them to be my only source. Sometimes with research, the people who are willing to complete interviews on a topic are already passionate about that topic. I need to be careful not to capture information only from people who are already invested in bias and equity work. I want to identify where bias might be hiding that we don’t already know about or have an active awareness of.

In research studies, in order to have good validity (strength), triangulation is key. Triangulation means the person doing the study uses multiple sources of information to corroborate the results, so to speak. This helps prevent issues like confirmation bias, which might look like placing undue weight on interviews the researcher personally identifies with. In this study, I am using interviews but also document analysis. The interviews provide context and personal experiences, and they point to additional sources of information. The documents will provide a historical narrative of these accounts and additional sources of information as well.

In regard to the concern that this project needs more realistic goals, more focus, and more explicit measures of success: I’d like to know more specifically what goals, focus, and measures of success you’re looking for.

The goal is to collect enough information to draw reasonable conclusions about what is happening. In research, the goal is not to collect all information from every possible source; it’s to study the subject until saturation is reached. This means that once I start reading and hearing the same things over and over again, I’ve reached saturation and there is little more to be learned. It is at this point that I will consider the project complete and finalize my deliverables.

The great part about this project is that my measures of success are already at 100%: I will complete the research and provide the deliverables (a written report published on Meta, recommendations about policies, a policy analysis, a data resource for further research, a methodological design for reuse in other communities, etc.). I have no external factors impacting my capability to provide these, well, except for project funding. :)

To the concern that "there is no evidence communities would follow the recommendations": to that point, there is no evidence they won't!

I agree there needs to be more discussion around this topic to gauge success. Due to the issues surrounding community health, participation, and other concerns, discussions on-wiki are not as open to all as we would like. Endorsing my project or participating on the talk page might have negative consequences for some contributors. For example, I reached out on the Wikimedia-research mailing list to share my project proposal and was promptly shut down by a frequent user of mailing lists. He did not think I should have shared my proposal in that way. No one responded in the thread, but people chose to email me directly instead. People in our movement have learned to be quiet in disagreement or face potential retaliation. People reached out to me directly about this project instead of responding on mailing lists or on-wiki. There is community desire for this work on English Wikipedia and other wikis. Due to this project proposal, I am currently emailing with a researcher from another language Wikipedia about collaborating to study bias on their language Wikipedia.

I am aiming to give evidence and guidance to those who are working to make a change in knowledge and information equity. I am fortunate that this project has received endorsement and support from many well-known people in the movement currently working on equity issues. That tells me I’m doing something right.

I look forward to your further feedback and response.

Best, Jackiekoerner (talk) 10:30, 23 May 2020 (UTC)

Bias discussion on Jimbo’s Talk Page

There’s an interesting discussion happening on Jimmy’s talk page about Wikipedia’s bias. Best, Jackiekoerner (talk) 15:01, 23 May 2020 (UTC)

Round 1 2020 decision


This project has not been selected for a Project Grant at this time.

We love that you took the chance to creatively improve the Wikimedia movement. The committee has reviewed this proposal and not recommended it for funding. This was a very competitive round with many good ideas, not all of which could be funded in spite of many merits. We appreciate your participation, and we hope you'll continue to stay engaged in the Wikimedia context.

Comments regarding this decision:
We will not be funding your project this round. The committee has expressed appreciation for the value of conducting research into implicit bias in the Wikimedia context; however, they described several concerns with the project. Some reviewers expressed that they would need to see more detailed information about goals and deliverables that would provide clarity about the expected impact. Multiple reviewers expressed concern about the breadth and cost of the project scope and indicated willingness to fund a reduced pilot version of the proposed work. They said they wanted a more concrete project plan and a description of how the results would be applied. Lastly, some reviewers were concerned about research on bias being conducted by a single researcher and wanted to see the incorporation of a review structure, like an advisory group.

Next steps: Applicants whose proposals are declined are welcome to consider resubmitting their applications in the future. You are welcome to request a consultation with staff to review any concerns with your proposal that contributed to the decline decision and to help you determine whether resubmission makes sense for your proposal.

Over the last year, the Wikimedia Foundation has been undergoing a community consultation process to launch a new grants strategy. Our proposed programs are posted on Meta here: Grants Strategy Relaunch 2020-2021. If you have suggestions about how we can improve our programs in the future, you can find information about how to give feedback here: Get involved. We are also currently seeking candidates to serve on regional grants committees and we'd appreciate it if you could help us spread the word to strong candidates--you can find out more here. We will launch our new programs in July 2021. If you are interested in submitting future proposals for funding, stay tuned to learn more about our future programs.

-- On behalf of the Project Grants Committee, Morgan Jue (WMF) (talk) 20:00, 29 May 2020 (UTC)
