Wikimedia Foundation elections/2024/Questions for candidates/Question 5

What are your thoughts about systemic bias on Wikimedia projects, both in their content and their demographics, including identity-based, language-based, economic/resource-based, ideological/worldview-based, and other forms of systemic bias? What measures or initiatives do you think the Board can appropriately take to address systemic bias?

Bobby Shabangu (Bobbyshabangu)

I believe systemic bias remains a significant challenge for the Wikimedia movement, particularly affecting underrepresented and Global South communities. While I have seen the support that WMF has provided to these groups, I think the focus should be on the movement itself. We need to activate and retain new users, and we need to be comfortable with new types of knowledge that we are not familiar with. We also need to give the Wikimedia Incubator our full support (at the moment it looks like a "by the way" project). I think WMF should invest in tools and training to help contributors recognise and mitigate their biases, and should ensure that Wikimedia's Friendly Space policies and the UCoC are effectively enforced to protect underrepresented communities, newcomers, and those from the Global South in particular. We need to foster a culture of continuous reminders, education, and support for users (both experienced editors and newbies). Over the years I have witnessed a lot of overt and subtle harassment within the movement; the UCoC needs to have teeth if we are intentional and serious about being inclusive.

Deon Steyn (Oesjaar)

Systemic bias in Wikimedia projects affects both content and community demographics, manifesting in identity-based, language-based, economic/resource-based, and ideological/worldview-based forms. Addressing these biases is crucial to ensuring that Wikimedia projects reflect the diversity of human knowledge and perspectives. Bigger groups dominate; consider the power of the English language on Wikipedia.

To address these biases, the Foundation should diversify the contributor base through outreach programs and partnerships with educational institutions. Supporting diverse languages and cultures with tools, translation resources, and cultural sensitivity training is essential. Resource allocation through grants, scholarships, and infrastructure investment is also vital; just as important, the returns on these investments must be measured.

Community guidelines and policies must be strengthened, with even better translation tools and assistance in creating specialist dictionaries for smaller-language Wikipedias. Anti-harassment measures and bias detection tools are necessary to create a welcoming environment.

Erik Hanberg (Erikemery)

There are many forms of systemic bias that exist throughout our global society and are thus present in Wikimedia projects as well, as cataloged on the Systemic Bias page of Wikipedia. People from the Global South and women are often underrepresented in favor of English speakers, men, and people from the Global North. I recognize that I am one of those individuals and carry a degree of privilege because of it. Part of my work in life is to carry that recognition with me and to help other privileged folks see it as well.

One place the Board might contribute to removing systemic bias from its platforms and services is by commissioning an annual or biannual report card on bias from a third party. This review of Wikimedia projects might help us see places for improvement and give the community a regular snapshot of how we’re making progress on these goals, as well as resources and tools to fix these issues moving forward.

Farah Jack Mustaklem (Fjmustak)

Systemic bias is an issue that we, the Wikimedia Movement, have grappled with ever since the inception of Wikipedia, and one we continue to work to counter to this day. It is a topic that is very important to me personally. The biases manifest themselves on different levels. One of the identified issues is gender representation: for example, less than 20% of biographies on the English Wikipedia are of women. There are initiatives to shed light on and bridge this gap. Similarly, there is a heavy English language bias: while English is the most widely spoken language in the world and is considered the most universally understood working language, this favors native English speakers. An example is that "Wikipedia" is often treated as synonymous with "English Wikipedia". The "Global North" is also heavily overrepresented: Western ideals, sources, and sometimes even people are placed on a higher tier than the rest. This inherent human bias is clearly evident in the makeup of the Board: there is not a single affiliate/community-selected member from the "Global South", for example. The will to counter this systemic bias is there, not least in the strategy recommendation to "Ensure Equity in Decision-making", and that is commendable.

What is often done to counter these biases, however, is to look at the symptoms rather than their root cause. For instance, in terms of sources: since Wikipedia (in its different language editions) relies on written sources, one is much more likely to find information written by Western men about Western men. Additionally, Wikipedia (English and others) tends to favor Western news sources, bias and all, over others, thus spreading the prevailing narrative in those sources as "fact". As for the community, the Wikimedia Movement is concentrated in the more affluent "Global North" (and is mainly male), which can foster an (understandable) sense of entitlement among those who make up the majority of contributors (financial and otherwise) in shaping the movement and the content of the projects.

What the Foundation can do is keep acknowledging that this systemic bias exists, listen to those subject to the biases and get their input on how the bias is perpetuated, and provide them with resources to counter it.

Christel Steigenberger (Kritzolina)

This is again a question on a topic that is not only important for the movement as a whole, but also dear to my heart. I believe that we need a healthy and active community that is as diverse in all aspects of identity as possible, to bring those different biases to a shared table: the reality reflected on our projects. Only by inviting in as many viewpoints as possible, by discussing and reflecting on all the different perspectives on the world and all there is to know about it, can we continue to uphold the commitment of freely sharing the sum of all knowledge with every single human being.

I believe it is a core responsibility of the Board to develop strategies for bringing this diversity to our communities where it doesn’t already exist: strategies that encourage the participation of voices that are not yet heard loud and clear. At the same time, we need to plan and strategize on how to move some perspectives out of the limelight, so that other viewpoints can be seen more clearly, all while making sure the knowledge publicly shared is reliable, well sourced, and trustworthy. This is not an easy task, but it is one about which I have many ideas that I would like to work on as a Board member.

Lane Rasberry (Bluerasberry)

The most important bias is financial bias, but currently the Wikimedia community has no way of reviewing systemic bias in Wikimedia Foundation investments. We have fundraised and spent US$1,000,000,000 in the past few years, and will spend another billion dollars soon, but the Wikimedia community does not have the documentation it wants in order to understand and discuss this spending. Part of our challenge in addressing systemic bias is that we do not have budget estimates of Wikimedia Foundation investments broken down by identity, language, economic status, worldview, and the other biases that we identify as strategically important. An even bigger problem is that there is a taboo on asking about budgets, which is why we do not have this information already.

I do not believe that we can address bias without being aware of our financial investments, and I want to make Wikimedia Foundation budgets more accessible to the Wikimedia community, to journalists, and to university researchers. We need independent third-party research to survey and report our spending. We need to know spending by country, and spending by demographic.

Lorenzo Losa (Laurentius)

A cornerstone of most Wikimedia projects is the neutral point of view. It provides a foundation and grounding for the editing work. Striving to avoid biases in content is a natural consequence of this basic principle. Editing communities are, in general, well equipped to deal with most non-neutral content: it's one of the key skills that every new editor has to learn. Systemic biases, though, are a particularly challenging category of bias to deal with, and we are not always doing enough to limit their influence.

There are two main parts to what we can do to counter systemic bias. The first is to be mindful of our own biases and the conditions that cause them. For instance, most of us have spent most of our lives in one specific country; there is nothing we can do to change that, or the fact that it brings a limited perspective on the world, but we can be mindful that our own lived experience is not the same as everyone else's.

There is a limit to what we can do individually, so the other key part is to support more diverse participation, especially in the editing community, so that we can complement each other. This is easier said than done, especially when the problem is rooted in an imbalance or asymmetry in the world; but still, actively trying to safeguard diversity can go a long way.

Maciej Artur Nadzikiewicz (Nadzik)

There is bias on Wikimedia projects; that is one sentence we can be certain of. It is gender-oriented, language-oriented, and takes many other forms. For a long time, the Wikimedia Foundation itself was aiding this bias by contacting the communities only in English and making decisions in a San Francisco-centered way. This has changed to some degree, but there is still room for improvement.

I am a big fan of evidence-based policy-making. As one of the other candidates says, we cannot change something if we don't know the extent of the problem. We need to measure and learn about the problems we have to deal with. Some of this information is already available, either measured by different Wikiprojects (Wiki Loves Monuments has amazing data on monuments and their condition) or by external research (mostly on gender and language gaps). We have to invest more in researching our own shortcomings. It may be uncomfortable, and it may be hard at first, but this approach will eventually lead us to know more about ourselves and our problems. Only then can we start working on them effectively, with a targeted approach.

Mohammed Awal Alhassan (Alhassan Mohammed Awal)

Systemic bias on Wikimedia projects is a significant issue that affects the quality, diversity, and inclusiveness of both the content and the community. Addressing this bias is necessary to ensure that Wikimedia truly represents the sum of all human knowledge.

First of all, there is an unbalanced amount of content in certain languages, particularly English, while other languages, especially those of smaller or less affluent communities, have significantly less coverage. To address this, the Foundation should invest in programs to develop and support Wikimedia projects in underrepresented languages, such as translation initiatives and technological support for content creation in these languages. It should also encourage the creation of content relevant to local cultures, histories, and perspectives, so that localized content builds a diverse and inclusive knowledge base.

Another way the Foundation can tackle systemic bias is to ensure that the governance structures of Wikimedia projects include diverse voices. This means having diverse representation on boards, on committees, and in leadership positions. There should also be an established feedback mechanism to listen to and address the concerns of underrepresented groups within the community.

Also, access to resources such as research materials, reliable internet, and editing tools is uneven, which disadvantages contributors from lower-income regions. The Foundation could deliberately provide special grants, scholarships, and other resources to support contributors from marginalized communities and to ensure they have access to the necessary tools and training.

Rosie Stephenson-Goodknight (Rosiestep)

“If you can’t measure it, you can’t improve it.” Let’s start there. Where we have measured a form of systemic bias, let us work systematically to research options for improving it. I have great expertise in gender-gap systemic bias, particularly women as readers, women as editors, and women’s representation (biographies, works, issues).

In 2015, I co-founded Women in Red (WiR). (Note: redlinks on Wikipedia lead to nothing; ergo, the name of our community.) WiR focuses on “moving the needle” on the percentage of biographies about women on Wikipedia, from 15.5% (October 2014) to 19.8% (June 2024). There are 34 other language Wikipedias that engage in WiR work. In 2017, I conducted 65 interviews with Wikiwomen from around the world in the first-ever Gender Diversity Mapping project. In 2023, with two others, I was part of the Research cohort of WikiWomenCamp, where we systematically catalogued research related to Wikiwomen. Since 2023, as a member of the WMF Board of Trustees, I have been able to make the case for including the gender gap as a strategic priority, and since then it has been included in the Annual Plan.

I recognize that there are many other forms of systemic bias, including intersectional ones, and I have been fervently committed to listening, learning, mentoring, and advancing improvements to the extent that I am personally able to do so.

Tesleemah Abdulkareem (Tesleemah)

Systemic bias is not entirely the Foundation's fault, as over time measures have been put in place, especially in the area of gender: more projects about women and LGBT+ people have been approved and organised within the organisation. However, one systemic bias I feel is still lagging behind is demographic bias: there is still underrepresentation of less developed regions, Africa for instance. The Board can come in through the Community Affairs Committee by holding sessions with affiliates within this group, providing more support, and ensuring there are specific policies that allow them to thrive.

Victoria Doronina (Victoria)

Wikipedia and other Wikimedia projects are created mainly by young white men from the Global North, and it shows. For example, the article about virginity on the Russian Wikipedia stated, without citing sources, that virginity as a concept is applicable only to women, which is simply not a fact. The people who had more resources and “got there first” overthrew the “old gatekeepers” and became the “new gatekeepers”.

However, the Internet has revolutionised access to information resources and audiences. Wikimedia projects are empowering women and other underrepresented minorities. WMF is working on reducing the gender gap in our content, which in turn should reduce the readership and editor gaps. The Universal Code of Conduct should help reduce the level of harassment of minorities. Several WMF initiatives are aimed at the Global South.

As a Trustee, I try to lead by example. I took part in the biannual WikiWomen+ Camp in Delhi in 2022, where wiki-women networked and received training. The other female trustee who was there, Rosie Stephenson-Goodknight, and I held a trustees' Q&A session where we told our life stories and encouraged the participants, especially those from the Global South, to take a more active role in Movement governance and become more visible. I'll take part in a WikiWomen Summit before Wikimania in 2024.

In general, it's great that we have four female candidates in these elections, which is 25% of the total. One of them is from the Global South. I hope that voters will help reduce the systemic bias in the movement's governance structures by voting for them.