Grants:IdeaLab/Reveal the Numbers Behind Edits to Increase NPOV Transparency

Using statistical data models, implement or promote an interface that reveals gender trends in editing history, in order to make any existing biases more apparent and thus increase communal accountability.
idea creator: Brickbeard
researcher: Xttina.Garnet
this project needs: volunteer, developer, community organizer
created on 05:00, 12 March 2015 (UTC)


Project idea

What is the problem you're trying to solve?

Women and other political minorities are continually discouraged from editing because it's an uphill battle against the loudest, though not necessarily most impartial, voices.

What is your solution?

It's hard for people to step out of their own shoes and view their biases impartially.

Revealing trends, such as male editors reflexively rejecting edits made by women, may also elucidate the nature and frequency of such behaviors to the editors who actually engage in them.

Employing simple data visualizations that the editing community can view may induce editors to develop more conscientious attitudes about their own perspectives.

This idea depends partially on editor self-identification.
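As a minimal sketch of the kind of statistic such an interface might surface, the snippet below computes revert rates grouped by self-identified gender from a small set of hypothetical edit records. The data here is invented for illustration; a real implementation would pull edit and revert histories from the MediaWiki API and the editor's gender from the optional user preference, and because self-identification is optional, editors who have not set it are grouped under "unknown" so viewers can see how much data the trend rests on.

```python
from collections import defaultdict

# Hypothetical edit records: (editor, self-identified gender or None,
# whether the edit was later reverted). Invented for illustration only.
edits = [
    ("A", "female", True),
    ("A", "female", False),
    ("B", "male", False),
    ("B", "male", True),
    ("C", None, False),   # editor did not self-identify
    ("D", "female", True),
]

def revert_rates_by_gender(edits):
    """Return the fraction of edits reverted, grouped by gender.

    Editors with no self-identification are counted under 'unknown',
    making the coverage limits of the data visible alongside the trend.
    """
    totals = defaultdict(int)
    reverted = defaultdict(int)
    for _editor, gender, was_reverted in edits:
        key = gender if gender is not None else "unknown"
        totals[key] += 1
        if was_reverted:
            reverted[key] += 1
    return {g: reverted[g] / totals[g] for g in totals}

rates = revert_rates_by_gender(edits)
```

With the sample records above, `rates` maps "female" to 2/3, "male" to 1/2, and "unknown" to 0, which is exactly the kind of per-group comparison a simple bar-chart visualization could display.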

Goals

Get Involved

Participants

  • Researcher I can do advanced statistics and help with design and interpretation. Xttina.Garnet (talk) 17:35, 24 March 2015 (UTC)

Endorsements

  • I'd like to see a visualization tool for gender of contributors to articles. I'm not sure how useful this would be, however, because of how few people mark their gender relative to all contributors. Ocaasi (talk) 00:40, 20 March 2015 (UTC)
  • Interesting proposal and I love the idea of trying to induce, or at least support, reflection. I also agree that it may be difficult to pull off because of reliance on self-identification but, hey there may be other, creative ways of trying to capture this information. -Thepwnco (talk) 21:22, 20 March 2015 (UTC)
  • There's objective and then there's clueless: I just edited an article on rape culture, and in the talk section I noted that while it's true that it's a theory, I could not find other Wikipedia articles on academic theories that pointed out they were a "theoretical concept" in the first sentence. Xttina.Garnet (talk) 17:21, 24 March 2015 (UTC)
  • I endorse anything that will defang aggressive gate-keepers. Doing so will also help male editors, after all. Even if the aggressive Wikiers still refuse to reflect on their actions, objective data can be used to demonstrate bias and instability during reviews and disputes. Thank you, Wordreader (talk) 19:41, 24 March 2015 (UTC)
  • The only way to accomplish this is to REQUIRE users to indicate their gender in their profile, which could be controversial. In order to protect privacy, users would need to be able to opt out of having their gender appear to the general public on their profile. Ask the question: Who should be able to see your gender on your profile? Everyone, no one, some special group... Meclee (talk) 18:15, 25 March 2015 (UTC)
  • Dealing with trolls of whatever sex/gender has to be the goal; obviously most are male. See Grants:IdeaLab/Controversy_Monitoring_Engine which is a more gender neutral proposal. Some combination of ideas in both of these should be a high priority. Carolmooredc (talk) 19:23, 28 March 2015 (UTC)
  • This sounds great. I'd like to think the results will bear out that editing is actually less intimidating for women than the media makes it out to be, but if not it should help the community better identify problems. Having a way to gather this data should help the movement for a long time to come. La salonniere (talk) 00:15, 31 March 2015 (UTC)

Expand your idea

Do you want to submit your idea for funding from the Wikimedia Foundation?

Expand your idea into a grant proposal