Community Wishlist Survey 2022/Larger suggestions/"POV Detector" for articles

"POV Detector" for articles

  • Problem: Wikipedia (among some other wikis) has a strict Neutral point of view (NPOV) policy, but some users don't know it well, so articles contain bias, and it is difficult to find an "unbiased" article that complies with the style guidelines.
  • Proposed solution: I propose an extension designed specifically for Wikimedia Foundation wikis that have an NPOV policy, called "POV Detector" (or "Point of view detector"). Activation would be optional for each user (activated by default for new and/or anonymous users). It would work as follows: when the user finishes editing, the system would check whether the article contains biased information that violates the NPOV policy.
  • Who would benefit: It would improve article quality, detect biased data, and raise wiki users' skills.
  • More comments: The "POV Detector" would appear in a yellow-colored window when the user finishes editing the article.
  • Phabricator tickets:
  • Proposer: Alejitao123 (talk) 21:31, 14 January 2022 (UTC)
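The proposal does not specify how detection would work; as the discussion below notes, a robust solution would require machine learning. Purely as an illustration, a minimal sketch of the simplest conceivable approach — flagging known "peacock" and "weasel" phrases with a fixed word list — might look like this. The term list and function name are hypothetical, not part of the proposal:

```python
import re

# Hypothetical, illustrative list of "peacock"/"weasel" phrases.
# A real detector could not rely on a fixed list; this only shows
# the shape of a check run when the user finishes editing.
NON_NEUTRAL_TERMS = {
    "legendary", "world-famous", "arguably", "obviously",
    "some say", "it is believed", "critics claim", "the best",
}

def detect_pov(text):
    """Return (term, offset) pairs for flagged phrases, sorted by position."""
    lowered = text.lower()
    hits = []
    for term in NON_NEUTRAL_TERMS:
        for match in re.finditer(re.escape(term), lowered):
            hits.append((term, match.start()))
    return sorted(hits, key=lambda h: h[1])
```

Each hit could then be highlighted in the yellow warning window the proposal describes. The limitation raised in the discussion applies directly: word matching cannot judge context, so "obviously" in a direct quotation would be flagged just the same as in editorial voice.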

Discussion

  • Seems like a decent idea in theory. After all, such tools are widely used by online community managers to detect abuse. --Rollo (talk) 19:09, 16 January 2022 (UTC)
  • There is a lot of research on this that could be used, but I suspect this is outside the size that CommTech would like to deal with. --Izno (talk) 21:11, 18 January 2022 (UTC)
  • Hello and thanks for taking the time to write this proposal. We reviewed this proposal as a team and have determined that this is out of scope for our team but an idea that's valid nonetheless. What is being suggested here would require elegant machine learning, which our team does not specialize in. Good idea in any case, I am therefore moving it to the Larger Suggestions Category. Thanks again! Regards, NRodriguez (WMF) (talk) 18:27, 26 January 2022 (UTC)
  • How on earth should a bot decide content issues? (N)POV is entirely about content, nothing about technicalities; this should (and could) only be decided by real people, not some bot. Even the use of bad words (which has nothing to do with POV but is content as well) is something that a bot cannot do properly, as there are too many ambiguous uses of words and too many possible circumventions. This is an impossible suggestion. Grüße vom Sänger ♫(Reden) 05:32, 3 February 2022 (UTC)

Voting