Wikimedia Foundation elections/2021/Candidates/CandidateQ&A/Question9
Gerard Meijssen (GerardM)
It is a bias that, because of arguments made by the bigger Wikipedias, functionality is not built for the smaller Wikipedias. As it is, no Wikipedia is able to maintain all its lists; lists can be generated and maintained from Wikidata and are often of higher quality. Basic infoboxes and lists may be shown where otherwise there is no information at all, providing superior wiki support to the public.
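(Illustrative note: the sketch below shows one way such a Wikidata-driven list could be generated. The Wikidata Query Service endpoint and its JSON result format are real; the specific query, items that are instances of "sovereign state" (Q3624078) labelled in Dutch, is a hypothetical example of a list a smaller Wikipedia might maintain this way.)

```python
# A minimal sketch of generating a list from Wikidata, assuming a
# hypothetical use case: a Dutch-labelled list of sovereign states.
import requests

ENDPOINT = "https://query.wikidata.org/sparql"  # real Wikidata Query Service

# Hypothetical example query: items that are instances of (wdt:P31)
# "sovereign state" (wd:Q3624078), with labels in Dutch, falling back to English.
QUERY = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q3624078 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "nl,en". }
}
LIMIT 50
"""

def fetch_list():
    """Fetch list rows from the Wikidata Query Service as JSON."""
    response = requests.get(
        ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "list-generation-sketch/0.1"},  # placeholder UA
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    return [row["itemLabel"]["value"] for row in bindings]

if __name__ == "__main__":
    for label in fetch_list():
        print(label)
```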
Dariusz Jemielniak (Pundit)
I believe that the front line of combating bias is actually increasing diversity. The problem is that the existing bias effectively prevents us from reaching diversity by design (in other words, it is difficult to increase it without stimulation). Thus, as expressed in my reply to another question, I find making efforts to increase diversity important in addressing this problem. Moreover, our most bias-free articles are the ones that have been edited and read by numerous editors. As I wrote in my book, bias is eliminated through the grinding together of different viewpoints - and for that we, well, need different viewpoints, so all the more we need people with different backgrounds. Additional consideration should be given to different forms of bias - for instance, PR agencies "spicing up" their clients' articles, or parties and regimes trying to whitewash their pages... Combating bias is not just removing inaccurate information; it is also making sure that our content proportionately reflects the existing positive and negative coverage of the described phenomena. So far we have rarely faced well-coordinated, large groups of experienced editors attempting to push their biased view on everyone, but my fear is that this will change. We should develop better approaches to defending against such strategies, and in my view global community taskforces (similar to stewards) could be useful. Pundit (talk) 09:48, 8 July 2021 (UTC)
Lionel Scheepmans (Lionel Scheepmans)
Once again, it is difficult for me to answer the question of bias, translated into French as "préjugé", without knowing exactly what we are talking about. There are so many biases and prejudices within the movement that it seems impossible to go into them all.
Reda Kerbouche (Reda Kerbouche)
No response yet.
Rosie Stephenson-Goodknight (Rosiestep)
Bias and diversity go hand in hand. In fact, when I wrote my answer to Question #7, I addressed combating bias. Combating bias starts with the recognition of our own biases; we all have them in one way or another. Combating bias is a circular process. Increasing reader diversity can lead to an increase in contributor (e.g. editors, developers, Affiliates) diversity. Contributor diversity can lead to content diversity (if the contributor feels safe and feels that their contributions are valued). Content diversity (e.g. more editors working on a single article; more articles regarding a particular topic, etc.) can lead to reader diversity. Policies (e.g. Notability; Reliable Sources) that deal with issues of bias within this circular process are key, bearing in mind that every wiki community is different.
Mike Peel (Mike Peel)
I agree with others who have said that the solution to bias is diversity. The processes that Wikipedias use to ensure NPOV generally seem to work, but they always work *better* where many editors collaborate on articles (particularly the most controversial ones), where information is required to be referenced, and where undue weight is avoided. I think affiliates (and the WMF to a certain degree) can also help reduce bias by making more content and references available (picking a semi-neutral example: most spacecraft media is from NASA rather than other space agencies, simply due to copyright; this bias could be reduced by persuading other space agencies to adopt free licenses). Thanks. Mike Peel (talk) 11:52, 11 July 2021 (UTC)
Adam Wight (Adamw)
My opinion is that the "neutral" point of view pillar is one of the best guiding principles of Wikipedia, and is probably more responsible than anything else for its success. However, "plural" point of view might better describe the intention. Neutrality and a lack of bias are problematic concepts because they are usually measured against the status quo.
In fact, one of the biggest biases already present is that Wikipedia sources reflect the status quo of scholarly publication: almost 70% of English Wikipedia citations are from the US and UK, and only 0.3% are from the continent of Africa. This is a representation issue entangled with a structural issue. Funding initiatives to diversify our pool of contributors, and broadening the scope of our projects to include more activism, music, and even non-encyclopedic writing, should help with these biases.
The other insidious type of bias is intentional disinformation. The movement strategy recommendations have identified this as a potential issue, but we need to quantify and measure the current scale of the problem. There has been work done to identify some examples of paid promotional editing, but we need a systematic approach. This type of editing is harmful to our projects because it erodes our credibility and the hard work of volunteers. In the bigger picture, disinformation on our projects is harmful to the world in the same way that propaganda is, and when it's found on our sites it erodes the idea that there is truth anywhere on the Internet. I will counter disinformation by supporting programs to identify undisclosed paid editing, and technical remedies like "blame", which would show the parts of articles written by paid editors.
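(Illustrative note: the sketch below shows, in miniature, what a naive "blame" view might look like, attributing each line of a page's current text to the editor whose revision introduced it. It uses the real MediaWiki revisions API, but the page title "Example", the User-Agent string, and the whole approach are assumptions for illustration, not the tool referred to above; a real tool would need to handle reverts, moves, and vandalism far more carefully.)

```python
# A minimal "blame" sketch under the assumptions stated above: attribute
# each line of the newest revision to the editor who first introduced it.
import difflib
import requests

API = "https://en.wikipedia.org/w/api.php"  # real MediaWiki API endpoint

def fetch_revisions(title, limit=20):
    """Fetch the last `limit` revisions (oldest first) with author and text."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|content",
        "rvslots": "main",
        "rvlimit": limit,
        "format": "json",
    }
    headers = {"User-Agent": "blame-sketch/0.1 (example)"}  # placeholder UA
    data = requests.get(API, params=params, headers=headers).json()
    page = next(iter(data["query"]["pages"].values()))
    revisions = [
        (rev["user"], rev["slots"]["main"]["*"]) for rev in page["revisions"]
    ]
    return list(reversed(revisions))  # the API returns newest first

def blame(revisions):
    """Walk revisions oldest to newest, carrying per-line attributions."""
    authors = []         # authors[i] credits the editor of current line i
    previous_lines = []
    for user, text in revisions:
        lines = text.splitlines()
        matcher = difflib.SequenceMatcher(None, previous_lines, lines)
        new_authors = []
        for tag, i1, i2, j1, j2 in matcher.get_opcodes():
            if tag == "equal":
                new_authors.extend(authors[i1:i2])   # unchanged: keep credit
            else:
                new_authors.extend([user] * (j2 - j1))  # added/replaced lines
        authors, previous_lines = new_authors, lines
    return list(zip(authors, previous_lines))

if __name__ == "__main__":
    for user, line in blame(fetch_revisions("Example"))[:10]:
        print(f"{user:>20} | {line[:60]}")
```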
Vinicius Siqueira (Vini 175)
Racial or gender imbalance, for instance, in the content of the projects is related to the low level of diversity in our community. The WMF must support strategies from the community to foster engagement of minorities and neglected groups, increasing diversity and equity of representation. This will have a direct impact on the diversity of content coverage.
Small and more vulnerable communities need special care, and organized attempts to capture projects and disseminate disinformation must be watched for. We should invest in innovative and effective tools to help combat disinformation.
I believe that self-governance and dedicated community work on keeping the content aligned with the pillars of neutrality and verifiability make the Wikimedia projects a relevant player in combating bias, fake news, and mis/disinformation on the Internet. We must therefore preserve these fundamental characteristics of our projects. --Vinicius Siqueira (talk) 06:20, 19 July 2021 (UTC)
Yao Eliane Dominique (Yasield)
It's all about accepting each other's differences. We cannot talk about inclusion in a movement without being able to accept the differences of others. As far as prejudice is concerned, I believe that an entity of the Foundation is well placed to provide adequate answers and actions. I would not like to make a statement.
Douglas Ian Scott (Discott)
A more diverse community is more likely to detect bias and more likely to take action to resolve it. However, bias has a negative impact on efforts to increase community diversity. In this way bias and diversity exist in a positive feedback loop, with one amplifying the other. Supporting the growth of diversity whilst also taking direct action to combat bias (such as friendly space policies or increasing bias awareness) is, in my opinion, likely the best way to approach this issue. Combating bias, much like writing a complete version of Wikipedia, requires a whole-of-society approach.--Discott (talk) 16:19, 21 July 2021 (UTC)
Pascale Camus-Walter (Waltercolor)
There is a lot that can be done to combat bias in Wikimedia spaces. Education to combat biases, as well as more diversity in the social profiles of contributors, may reduce bias. But there may also be other aspects. What is a bias, in fact? "Bias" may come from the Latin biaxius, meaning "two axes". It can be understood as "squinting": when you don't focus correctly, you don't have a clear vision. But it can also mean doing something skilfully, in the sense that you find a way to resolve a problem from the side when you cannot resolve it head-on. I believe that the more efficient people are, the more they'll use what we could call "operational biases" to move forward faster. In this sense, efficacy may foster biases. A bias is a sort of shortcut you use so as not to lose time or energy. But it makes it difficult for other people and newcomers to follow all these side roads. That is perhaps why Wikimedia communities, whose efficiency is remarkable, also tend to cultivate biases more than other spaces in society. So a good way to combat "operational biases" may be to allow more direct access to editing the projects, by creating efficient supports that simplify navigating the labyrinthine rules that prevail especially on Wikipedia (e.g. self-appraisal tools, collegial training, a clear understanding of administration, easy access to lists of WP:XX…) --Waltercolor (talk) 10:30, 9 July 2021 (UTC)
Iván Martínez (ProtoplasmaKid)
I am satisfied with the neutral point of view of the Wikimedia projects, which, while I consider it a fundamental aspect, also depends on a permanent effort to bring in more people to edit. It would be desirable to know how we are doing on something like the demographic window concept in our contributor base. I'm not sure whether we have a sustained rate of attraction and retention of 15-25 year olds in editing.
On the other hand, in today's world the vanguard of social change is being led by movements that reflect deeply on the intrinsic majority biases of the world. There are groups of people fighting against linguistic, cultural, racial, gender and other biases, and there is much to be learned from their reasoning. For example, our model of collaboration in the cultural sector prioritizes, by its very name, collaboration with galleries, libraries, archives and museums (GLAM). Such a paradigm privileges four centers of cultural production, while there are other centers of knowledge production that are not such sites and that we are probably not observing. This is a simple example of how we can combat biases: by attracting more social and strategic sectors that can contribute to a more enriched worldview.
Victoria Doronina (Victoria)
Bias exists where there are not enough editors, or where the balance is heavily dominated by one side of a debate. Attracting and retaining new editors, and making the community friendlier to newbies and more diverse, will combat bias. For example, we had this problem with the Armenian-Azerbaijani conflict, where there was a coordinated one-sided attack by a bigger group of editors who coordinated their response off-wiki. We introduced mediation by neutral admins and editors, raised the requirements for the quality of sources, and most of the articles are close to NPOV now.--Victoria (talk) 07:55, 8 July 2021 (UTC)
Lorenzo Losa (Laurentius)
Stated in this way, this question is extremely vague; I will base my answer on the clarification given on the talk page.
The Wikimedia projects have strong tools to combat biases. Neutrality is enshrined as one of the five pillars of Wikipedia, and since anyone can edit, anyone can (theoretically) fix any issue.
This system has proven very effective, and our projects are among the most reliable websites on the internet. However, there are clear limits. First, these mechanisms work with large numbers and with time. This puts smaller projects, and pages with lower visibility, at risk. We can mitigate this risk by working more as a global movement. The local community of a single project may struggle, but it can tap into the collective experience of the global community.
Second, as we refer to other sources of information, we can inherit existing biases. In a way, this is unavoidable; the alternative, creating our own truth, would be much worse. Referring to external, reliable sources remains key, as does increasing the size and diversity of our community in order to reduce the risk of a limited perspective.
Raavi Mohanty (Raavimohantydelhi)
I believe that biases were not created overnight; they have permeated the subconscious of the majority very gradually. Both mainstream and popular culture have only acted as echo chambers, presenting a one-sided view. The only way this can be undone is to give more diverse representation. A proactive approach needs to be taken to make sure that all sections of society have an equal voice. Once marginalized and underprivileged communities are allowed to voice their opinions freely, various biases will also decline. Raavimohantydelhi (talk) 13:20, 10 July 2021 (UTC)
Ashwin Baindur (AshLin)
Bias will always remain a characteristic of Wikipedia, a perennial problem which arises because it is the work of a large and diverse volunteer community; anyone who so wishes can edit Wikipedia. Some cases of bias are due to the points of view of individuals, which manifest in the content and the way articles are written. In every project, there will be topics which are covered in detail while other topics have little or no coverage. In the case of content and point of view, the usual Wikipedia processes (the ability to rewrite, to undo, to debate and develop consensus on talk pages, to get resolution from noticeboards, and so on) make this a manageable issue. In the case of skewed coverage, gaps can be pointed out and the community encouraged with positive inducements to remedy them.
Systemic biases are harder to resolve; examples include gender bias, notability bias, and ideological bias. Such biases result in skewed representation despite our diversity, in under-representation, and in wide gaps in knowledge on certain topics. More importantly, they may result in unfair and uneven application of policy for new article deletion, different standards of notability, hostility to points of view, excessive rule-making, and even edit warring.
Biases arising from this nature of Wikipedia can be countered through greater awareness: pointing out behaviour that is restrictive, abusive or prejudiced. Reasoned discourse and encouraging positive forms of social behaviour are useful. Positive discrimination in outreach events could also help in getting under-represented communities to join the movement.
Awareness of such bias, and informing and educating the community through discussions both online and offline, is important, because bias often arises from ignorance. The community needs to be provided with informational resources for this, and WMF staff can play a role here.
However, ensuring a safe environment and providing safe spaces is very important. There must be designated safe places where respectful discourse can take place, where an unpopular opinion is not punished for being articulated, and where ad hominem attacks are not tolerated, not just in Wikimedia project spaces, but also in social media forums on external platforms. If there are such places, then honest debate can happen without anyone being harassed for their point of view, and greater understanding, or even consensus, could be reached on these issues.
The language environment is a huge source of bias. While certain Wikimedia projects run fully in the native languages of their communities, important projects such as Wikimedia Commons, Wikidata, and even Wikimedia Meta-Wiki all operate in English. Discourse across the Wikimedia movement is dominated by English as the medium, and the internet itself is predominantly English-based. Making important standalone projects fully usable in languages other than English is the first step.
Resource bias, across communities, amongst editors within a community, and amongst the projects run by a community, is also an important form of bias. It would be partly alleviated by the creation of hubs with dedicated budgets. However, a new look at how resources are distributed across communities and projects is necessary and should guide future policy at the WMF. The Board of Trustees has a role to play in this.
Research into aspects of systemic bias is important so that any discussion or action can be based on facts rather than only on anecdotes. The Wikipedia Research Team is useful for this.
The biggest challenge involving bias, however, is that of communities dominated by ideological factions that conspire to destroy the fundamental principles of Open Knowledge in general, and the pillars of Wikimedian culture in particular; that hijack the content and the editing processes; where dissenting editors face online and real-world persecution; and where such suborning of Wikimedia projects has the sanction, encouragement and protection of governments. This requires a lot of discussion and analysis, as any form of resolution requires the collective action of the global community, that language or project's community, and the WMF.
It is here that the Board of Trustees has an onerous responsibility and must institute an initiative to address the issue. If elected, I pledge to work towards the resolution of bias on Wikimedia projects as a priority.
Pavan Santhosh Surampudi (Pavan santhosh.s)
When a Wikipedia community is small and populated with people sharing the same backgrounds, bias is inevitable. On the flip side, if we can improve diversity and invest in community growth, bias can be tackled well within the project. In addition to diversity and community growth, providing tools and investing in better skills and research will help the community handle this battle well. --16:40, 18 July 2021 (UTC)
Ravishankar Ayyakkannu (Ravidreams)
There are two types of biases on Wikipedia:
- The bias flowing from the inequitable world in which we live.
- The bias sustained by the design, policies, and decisions of our projects and movement.
While we cannot totally eliminate the first type of bias, which is a social, economic, and political problem, we can certainly take some concrete steps to eliminate the bias inherent in our projects.
We need to invest in developing tools to identify bias, in bringing in and retaining more new users, and in enabling the communities to discuss and refine content policies.
Farah Jack Mustaklem (Fjmustak)
Bias is a very complex issue to overcome. There is bias in every aspect of the Wikimedia Movement: from gender bias to language-based bias to Global North/Global South bias. The Wikimedia Foundation and the Wikimedia Movement are pioneers in countering these biases, but there is still a very long way to go. Encouraging worldwide participation in the knowledge-creation and decision-making processes should serve to increase diversity and lessen this bias. --Fjmustak (talk) 23:59, 31 July 2021 (UTC)