Requests for comment/Image filter on Wikisource

The following request for comments is closed. No activity since 2011 when it was still in its preliminary stages. Should somebody wish to revisit this matter, please start anew. Snowolf How can I help? 06:07, 21 July 2013 (UTC)


Problem description

At the end of May 2011, the Board of Trustees of the Wikimedia Foundation (WMF) decided, in its Resolution: Controversial content,[1] to have a software feature for a personal, optional image filter developed and implemented on all of its projects:

“We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view, either when first viewing the image or ahead of time through preference settings.”[2]

The decision is based, among other things, on the results and recommendations of a 2010 study commissioned by the Wikimedia Foundation, known as the Harris Report.[3] The report proposed a form of personal image filtering in its recommendations 7 and 9.[4]

The proposed filter, the filter categories, and the method of implementation are disputed among users of the Wikimedia projects.

  1. Wikimedia Foundation Board of Trustees: Resolution: Controversial content, 29 May 2011
  2. Wikimedia Foundation Board of Trustees: Resolution: Controversial content, 29 May 2011
  3. Robert Harris and Dory Carr-Harris: 2010 Wikimedia Study of Controversial Content, 2010
  4. “It is recommended:
    7. That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (‘under 12 button’ or ‘NSFW’ button). […]
    9. That a series of additional user-selected options be created for other images deemed controversial to allow registered users the easy ability to manage this content by setting individual viewing preferences.”,
    Robert Harris and Dory Carr-Harris: 2010 Wikimedia Study of Controversial Content: Part Two, 2010

Planning by the Wikimedia Foundation for the implementation

The precise functioning of the image filter is not yet fixed. So far the developers have created an informal working draft with sample images. The following is currently planned:

The filter is designed to enable readers to hide files on their own screen, at their own request, according to specific criteria. The filter settings can be personally adjusted. Files with potentially controversial content will get an additional “Hide image” button, which changes the filter settings so that the file is hidden. Files already hidden by the filter will get a “Show image” button, which changes the filter settings and then displays the file. In addition, the page header will always display a link to the filter settings. Although the Foundation speaks of an “image filter”, according to the rough draft all categorized media will be covered, e.g. video and audio files.

To enable the filtering, readers shall be able to specify criteria such as “sexually explicit”, “medical”, or “depiction of violence” for the files they do not want to see.[1] The files shall be filtered using corresponding filter categories, to be created within the existing category system on Commons.[2] For locally hosted files, corresponding filter categories shall be set up on the local Wikimedia projects, such as the German Wikipedia.[3]
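
To illustrate the category-based mechanism just described, here is a minimal client-side sketch in TypeScript. It is not the Foundation's design: the storage key, the data attribute, and the way preferences are read are assumptions made purely for illustration.

  // Hypothetical sketch only, not the WMF draft design.
  // Assumption: each media element lists its filter categories in a data
  // attribute, e.g. <img data-filter-categories="sexually explicit,violence">.

  // Assumed storage key for the reader's personal filter preferences.
  const stored = localStorage.getItem("hiddenFilterCategories") ?? "[]";
  const hiddenCategories = new Set<string>(JSON.parse(stored));

  function applyImageFilter(): void {
    document
      .querySelectorAll<HTMLElement>("[data-filter-categories]")
      .forEach((el) => {
        const categories = (el.dataset.filterCategories ?? "")
          .split(",")
          .map((c) => c.trim());
        // Hide the element if it matches any category the reader chose to hide.
        el.hidden = categories.some((c) => hiddenCategories.has(c));
        // A full implementation would instead show a "Show image" placeholder
        // button here, as the rough draft describes.
      });
  }

  applyImageFilter();

Under these assumptions, opting in would simply mean adding category names to the stored list and re-running the filter; removing them would restore the hidden files.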

From 15 to 30 August 2011 the Wikimedia Foundation held a Wikimedia-wide “Image Filter Referendum” to gather opinions on the design and use of the filter function to be developed. Eligible to participate were registered contributors with at least 10 contributions, developers of MediaWiki, staff and contractors of the Wikimedia Foundation, and members of the Board of Trustees and the Advisory Board of the Wikimedia Foundation. Participants could submit a numerical vote on six statements.[4]

The referendum did not offer the possibility to vote explicitly for or against the implementation of the image filter. Board member Samuel J. Klein, who co-authored the questions, later stated that such an option should have been included.

  1. example figure from the rough draft
  2. Commons:Categories
  3. Category Equivalence Localization
  4. On a scale of 0 to 10, voters could specify their degree of approval of the following statements: It is important
    “for the Wikimedia projects to offer this feature to readers.
    that the feature be usable by both logged-in and logged-out readers.
    that hiding be reversible: readers should be supported if they decide to change their minds.
    that individuals be able to report or flag images that they see as controversial, that have not yet been categorized as such.
    that the feature allow readers to quickly and easily choose which types of images they want to hide (e.g., 5–10 categories), so that people could choose for example to hide sexual imagery but not violent imagery.
    that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial).”
    Wikimedia Foundation: What will be asked

Proposal

Despite the decision of the Board of Trustees of the Wikimedia Foundation, the personal image filter (a filter which hides images on the basis of labels or categories; see the preliminary design of the Wikimedia Foundation) should not be implemented on any content page of any Wikisource project, nor should filter categories be set up for files stored locally on any Wikisource project.

Arguments

See also German Wikipedia poll (de; en)

Arguments for the proposal (i.e. against filter)

  • Wikisource is a free digital library, and librarians reject the labeling of content that can be used for censorship.
    The image filter requires the creation of prejudicial categories, which the ALA defines as "censorship tools":[1] prejudicial categories can be read and misused by third parties (ISPs, local networks, etc.) to filter out content they consider controversial, removing choice from users.
  • Wikisource's mission is to provide faithful, high-fidelity reproductions of published works. Hiding parts of a work is contrary to the Wikisource definition of NPOV.
  • Wikisource content is primarily old public-domain works, which were already subject to editorial control and/or censorship at the time of their original publication.
  • Wikisource is a separate project, and its content should not be subject to labeling systems primarily promoted, constructed, and maintained to limit readers' exposure to controversial content on user-generated content projects like Wikipedia.
  • Depictions of violence or explicit sexuality are found in textbooks, including school textbooks. Didactically, this is not disputed.

Arguments against the proposal (i.e. pro filter)

  • Wikisource is a worldwide project, so it should respect the cultural norms of the whole world, including those of people outside Europe and North America.
  • Wikisource appeals to a large audience with differing standards for content, and it would be wrong to deny someone the written portions of a text because of image-related concerns.
  • Wikisource deals in part with texts that have entered the public domain because of their age, and many works that may contain objectionable content also have versions that do not.
  • The front end for the filter is opt-in, so the users are given a choice if they want to personally see material or not. (Some back-end elements of the filter scheme are mandatory, however, and cannot be avoided.)
  • Giving users the ability to hide images on an individual basis in no way detracts from the overall quality of the content on Wikisource, as the content remains fully available to anyone who wants to view it.
  • This would not be direct censorship, because censorship means deletion. The filter does not delete anything; it only hides files, and only voluntarily.
  • While it is within both Wikimedia's and Wikisource's philosophy to provide complete information, some viewers expect presentation following the principle of least astonishment and might not visit the site without filters. Enabling this on Wikisource furthers Wikimedia's goal of providing sources to everyone, by allowing users to access areas of the site they would not otherwise visit due to conflicting beliefs and values.
  • Wikisource is open to all members of the Wikimedia community, but opposing the WMF resolution makes the project less welcoming to those who agree with the WMF resolution, further isolating the Wikisource community.
  • The WMF proposal is opt-in, so users are given a choice whether they personally want to see material or not. By rejecting the WMF proposal, Wikisource would deny its users that choice and impose a set of values that may contradict individual values.
  • Wikisource is a valuable tool that could provide information not accessible to millions of school children. By not providing a filter, Wikisource would force some schools to choose between respecting their laws or having access to content.
  • The purpose of the WMF and its projects is not to promote a particular legal philosophy, nor to act as a proponent of "free speech". Claims that third parties could censor others (indirect censorship) by using the WMF filter outside the WMF do not apply to the WMF itself, nor is it our mission to fight censorship by others.

Arguments that could go both ways

  • Wikisource is a valuable tool that could provide information not accessible to millions of school children. By not providing a filter, Wikisource forces people either to provide access to all content unaltered, or to provide no access at all. This has been the standard mode of operation for all Wikimedia projects so far. In most cases, it has embarrassed (POV) censors and furthered our mission of providing NPOV information.
  • Separating relevant, essential information from irrelevant, inappropriate, or less important information has always been part of editing in all projects. While pictures are often a means of bringing information to the point, in most cases they are not necessary to understand a text. When editing, editors always have to consider whether, which, and how pictures are included. So this kind of filtering is already part of creating content in most WMF projects.
  • Providing a filter with openly accessible categories could convince third parties to remove a block on Wikisource, but it could also convince third parties that did not block Wikisource at all to use the categories to restrict access to content labeled as controversial.