Talk:Image filter referendum/Archives/2011-08-22

Stay focused

The wmf:Resolution:Controversial content lists the following reasons why an image might be filtered:

  • Images that are offensive because of their sexual, violent or religious nature.

As we see in the discussion on this talk page, the third point, whether an image relating to religion is offensive, is difficult to judge. It is even more difficult to put pictures into "politically offensive" or "culturally offensive" categories.

So let's please stay focused on the realistic goals: optional filtering of sexual or violent images. Similar to Google SafeSearch, which filters sexual content and is activated by default. --Krischan111 00:18, 17 August 2011 (UTC)

Even sexually related content is already a very hot topic, with very different viewpoints. See two paragraphs above. --Niabot 00:24, 17 August 2011 (UTC)
They mean "Muhammad images". Probably people could name other examples, but only as cover. Wnt 05:59, 17 August 2011 (UTC)
Offensive because of religious nature? Who knows what could be offensive to anyone? I remember that there is a principle like NPOV. Each decision about whether a picture can be offensive because of its religious nature is personal opinion. So we can easily see: it must be possible either to filter all pictures or no pictures. Anything else is not NPOV. --78.94.169.111 00:16, 18 August 2011 (UTC)

Impractical and Dangerous

Well, I was going to say "stupid and dangerous", but I decided I'm a nice guy. No, but seriously, it is stup...I mean, impractical because

  • It puts strain on the already overstrained servers; this feature is probably very software-intensive. We already have bimonthly panhandling from Jimbo to improve the servers; there is a limit to how much people are willing to donate, and I don't think this limit is significantly increased by an "opt in" image filter.
  • It puts strain on the already overstrained editors: now, on top of adding content, fighting trolls, shills, zealots, cranks, vandals and cud-chewers and trying to enforce and discuss, literally, hundreds of policies, they shall also decide the merits of tags.

It is also, in spite of what optimists, outraged citizens and do-goody-goodies say, dangerous, because:

  • The slippery slope is there. Wikipedia has bent to pressure before, most notably when it implemented a BLP policy that differs from the policy applied to all other articles. There are hundreds of well-argued pages with buzzwords like "responsibility" to justify it, but the bottom line is: living people litigate. Now, pretty much everything on the Internet is subject to the Maude Flanders attack, which consists of screaming inconsistently, loudly and repeatedly "Will anybody think about the children?" and demanding that policies be implemented to protect "them" from the evils of...whatever bigots think is evil. An uncategorized encyclopedia is quite impervious to this, as it is very hard to consider it offensive as a whole. Once parts of it are tagged, it's a breeze to demand that they be censored, "for the children's sake". Or to "safeguard religious sensitivities" other than the single reader's. Or to "comply with a nation's governing law". Or to abide by special copyright laws.
  • There is not enough detail on the technical implementation so far, but it is entirely possible that a firewall could easily be configured to force filtering, or to filter directly. This would mean Wikipedia editors and programmers would be working to make things easier for the Great Firewall of China and the Saudi government.
  • Assuming the above is not true, there are still two possibilities: people will not use the image filters, in which case their implementation is useless, or they will. If they do, censorship will simply consist of mass-tagging unwelcome images with unpopular tags, most likely "pedophilia" or "possible copyright violation". Should I remind you that a sizable portion of the editors consists of shills for corporate, governmental or religious organisations?

Well, these were my two pennies; in the end, all policies are voted in, as the enthusiastic are, by definition, more active than the cautious. In six months' time, I'd like to point to this discussion and say "I told you", but truth is, everybody who'd care will be too disgusted by then. 77.233.239.172 09:07, 17 August 2011 (UTC)

  • While I don't agree about the parallel with BLP (there is a substantial difference between trying to prevent readers being disturbed by imaginary offense in patterns of colored dots and preventing real harm to third parties), those are all very good reasons why this idea should be scrapped. — Coren (talk) / (en-wiki) 11:08, 17 August 2011 (UTC)

How will category pages and galleries look?

Image filter referendum/en#What will the image hider look like? tells how it looks on article pages. How would it look on a category page? For example, how would commons:Category:La Liberté guidant le peuple (a famous painting by en:Eugène Delacroix) look for people who want to remove images of topless women (or commons:Category:Auguste Rodin for people who want to see only the non-nude portraits)? The size of the "show image" button added to the length of the "filter settings" link seems to be too much to fit the tiny space allowed by tiny thumbnails on category pages. By the way, the same size problem might exist in galleries using the < gallery > < / gallery > tags. Teofilo 09:14, 17 August 2011 (UTC)

Some Questions on the "Editors support"

The section "Background" reads:

"We will also make it as easy as possible for editors to support."

1. To start to talk about what the editors will do there, could we call it by another, perhaps more neutral term than "support"? What about "classify" or "categorize"?

There are image files on the local projects and image files on Commons. For some images, there are versions on different projects.

2. Will the filter allow users to see any file that is not from Commons?

3. Will there be a classification system only on Commons, or will there be different classification systems on each local project?

4. And if there were different systems, which of these systems would the filter apply?

--Rosenkohl 10:50, 17 August 2011 (UTC)

See #Categories for why the existing category system doesn't work for the report's suggestion, and why that solves nothing.
There would need to be a universal classification system on any project (of the 777+) that supports images. Otherwise the system would not be culturally neutral. (This could perhaps be done by using an XML/RDF markup system such as POWDER, with check-sums, attribution etc.) Since the system is to be culturally neutral there will need to be dozens, perhaps hundreds, of such categories, and every image will need to be checked for every category (of course a picture of a naked woman is also a picture of a woman, and a picture of a person, and a picture of a living being, etc.). Quite how these categories would work, since, for example, cultural norms are constantly shifting, and vary from town to town (and even within towns) let alone between countries, is I suppose up for discussion. Perhaps a really detailed ontology and a rules-based system? But I jest - we manifestly do not have the resources (or the will) for the AI that would be required to drive such a system - realistically this is a bad idea masquerading as a great idea. In this case the devil is in the detail; some ways of working around those problems - and other problems that I have outlined - would take us into even murkier territory. Rich Farmbrough 14:27 17 August 2011 (GMT).
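
To make the containment point concrete, here is a minimal sketch in Python; the category graph and every name in it are invented for illustration and reflect neither actual Commons categories nor MediaWiki code. Filtering on any broad category an image transitively belongs to over-blocks almost immediately:

  # Hypothetical category graph: each category lists its parent categories.
  PARENTS = {
      "Nude women": {"Women", "Nudity"},
      "Women": {"People"},
      "People": {"Living beings"},
      "Nudity": {"Human body"},
  }

  def with_ancestors(cats):
      """Expand a set of categories to include every ancestor category."""
      seen = set()
      stack = list(cats)
      while stack:
          cat = stack.pop()
          if cat not in seen:
              seen.add(cat)
              stack.extend(PARENTS.get(cat, ()))
      return seen

  def is_hidden(image_cats, user_filter):
      """Hide the image if any direct or inherited category is filtered."""
      return bool(with_ancestors(image_cats) & user_filter)

  # A user filtering "People" hides the nude, the clothed, the crowd scene...
  print(is_hidden({"Nude women"}, {"People"}))  # True
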
Worse yet, there is no method to make such classification — even if it were perfect and culturally neutral — impossible to abuse in order to push an agenda. Want to alter perception of subject x? Simply manage to mark it as offensive according to y. Add a few socks to the mix, position a few community leaders on polarized ends of the debate and voila! instant shitstorm.

As someone stated above, there exist exactly two neutral filtering criteria: filter everything and filter nothing. Everything else is someone making a subjective value call that renders the system useless at best (and actively harmful at worst, as social pressure and technical enforcement will be used to turn on that filtering against readers' interests). Think this is hyperbole? Is an image of a topless man "(partial) nudity"? Is one of a woman's? What about bare arms? Or visible hair? And we're not even getting into the incredibly complicated terrain of religious imagery. — Coren (talk) / (en-wiki) 14:59, 17 August 2011 (UTC)

Please tell me: does it mean that if I create a new picture, I have to tag it according to these rules (violence, sex, medical, etc.)? Does it mean that I can see all pictures while editing, but they look filtered once the edited article goes live? JZ 20:54, 17 August 2011 (UTC)

clarification added

I've added the following clarification. AFAIK the Wiki(p|m)edia community generally dislikes ambiguous statements, so hopefully it should not be difficult to reach consensus on exact wording, pending consensus on the title change. I added:

<!-- clarification --> ''This page concerns a consultation about the details of how to implement a decision that was taken by the WMF Board in June 2011. It does not concern the [[:w:referendum|possibility of accepting or rejecting that decision]].''

Hopefully that should clear up the repeated confusion (including my own initial confusion due to not reading carefully enough). Boud 10:49, 17 August 2011 (UTC)

Visible to anonymous users who can't vote?

Why can I see the banner when I'm not logged in, when anonymous users aren't eligible to place a vote? Surely this should only be appearing for logged-in users, unless the voting eligibility is changed (as suggested above)? Mike Peel 16:53, 17 August 2011 (UTC)

Because a lot of people read without logging in. The idea is that it will trigger them to log in and vote. 16:56, 17 August 2011 (UTC)
Yes - but only a tiny fraction of Wikimedia's 400 million readers have user accounts! You can't be serious that displaying this banner to everyone, when less than 0.1% or so of those viewing it will be able to vote (and those could most likely be reached by restricting the banner to logged-in editors only), is a worthwhile use of readers' screen estate?? Mike Peel 17:10, 17 August 2011 (UTC)
I actually completely disagree with "who could most likely be reached by restricting the banner to logged in editors only": we have a purposely low bar (I think that's a good idea), and an enormous number of those people are NOT going to be reading logged in and would NOT get caught by restricting it to logged-in editors only. I may be mistaken, but I think the board elections do the same thing (with a much higher voting bar). Jalexander 17:27, 17 August 2011 (UTC)
This will only cause frustration and confusion for all the unregistered users who want to participate in the referendum. They will follow the banner link only to find out that they are not allowed to vote and that they can't even change that by registering now. Please only show the banner for logged in editors. --Wkpd 17:30, 17 August 2011 (UTC)
It will be backed way off at the end of the first week of voting... until then, let's just let it go? We know that the current system is working, based on the extremely high number of votes cast. Philippe (WMF) 17:32, 17 August 2011 (UTC)
A week? That's about 3 _billion_ banner impressions! [1] Mike Peel 17:35, 17 August 2011 (UTC)
I argued above that the barrier should be lowered even more than it has been - so every Wikipedia reader can vote - but since that hasn't been done, having this banner visible everywhere really isn't a good thing. I thought the WMF board elections were logged-in only, but I could be wrong. Either way, it would be far better to e.g. email all registered users than to stick a big banner at the top of every single pageview of Wikipedia, by anyone, everywhere. Were we in the situation where 99% of our readers were also our editors, fine - but we're clearly not in that scenario. I'll somewhat cede the point about logged-out editors not noticing it otherwise - but the cost far outweighs the benefits here. Mike Peel 17:34, 17 August 2011 (UTC)
Out of the universe of people who are eligible to vote, the vast majority will not be logging in during the two-week voting period, because the vast majority of people who are eligible are former or infrequent editors. However, I would imagine that a good majority of eligible voters will be browsing Wikipedia during that two-week period, because "everyone" reads Wikipedia to get some knowledge or another. Based on that, I would believe that the best way to reach eligible voters is to display this to users who are not logged in. That's how I found out about it, anyway. I wouldn't have found out about it otherwise. I agree, e-mailing might work too, but the banner just seems easier. JYolkowski 19:15, 17 August 2011 (UTC)
It indeed seems easier. And it is very effective. However, the same goes for shooting a nuclear missile at a cloud of mosquitoes. Surely many mosquitoes will be hit. However, I have to agree with the overkill argument. We are throwing so many sitenotices at people, and this is typically one that a huge group of people can do without. It is not efficient at all. Goodwill is important to Wikimedia - and warming people up with a question that they can give their opinion on and then denying them the actual vote doesn't help with that. Nor does showing them banners every time while we tell them that Wikipedia is so great because we don't have ads. I can imagine that several people would consider messages like this, which are really internal stuff, to be spam. Are we really serving the readership with this? Or free knowledge? Effeietsanders 19:50, 17 August 2011 (UTC)
I for one usually visit Wikipedia as an anonymous user and only log in when I want to edit something which I want to be able to find again (via the "view your edits" function), which only happens once in a while. However, sometimes when I can't be bothered I edit as an IP as well; therefore, I have no objection to everyone seeing the note. Furthermore, if the image filter is not implemented, the concerned people (i.e. the ones who are easily offended by pictures of breasts, vaginas and penises) will know they failed and that they should grit their teeth and bear it (OMG Breasts! Quick - look away!)--79.234.118.10 11:11, 18 August 2011 (UTC)

I think I've got an idea that bridges parties from both sides. Why not dedicate a sub-page specifically for IPs to vote on? This way, while the IP results are technically non-binding, at least we can gauge the support/oppose ratio from the IPs. This is sort of like the "neutral" section of RfAs: while the vote doesn't count, it allows others to be informed of their views. We can add a link to the Special:SecurePoll/vote/230 landing page and point the IPs to vote at the dedicated sub-page (if IPs can reach and view the contents of the landing page in the first place...). We'll also add a sentence reminding the IPs that they should log in to vote if they have an account, and vote only with the account, not both with the account and as an IP. OhanaUnitedTalk page 15:20, 18 August 2011 (UTC)

Terrible idea!

This is the first I've heard of this proposal, but as a long-time Wikipedia user, I think it's an absolutely terrible idea. How on earth did this ever get off the starting blocks?

I don't have an account, and no I won't register for one, so therefore I don't have a vote. But like millions of other users, I'll be affected by whatever the outcome is. This is particularly so if the decision is that you have to log in to remove censorship.

Wikipedia was an excellent idea when it started. The encyclopedia that anyone can edit! But no more. I helped out with quite a lot of articles in the early days, but it's been going progressively downhill for the past few years. I really don't know how long it will be until I abandon it. --88.108.222.49 20:02, 17 August 2011 (UTC)

Hello, and thanks for posting. We wish we could've included anonymous users like you in the voting, but unfortunately that would have been very hard to implement technically, and it would make it easier for people to use sockpuppets to game the system. Just so you know though, this filter would be completely opt-in, both for anonymous users and for logged-in users. You wouldn't have to log in to remove anything, because it would only be there if you wanted it to be there. Logged-in and logged-out users can both choose what is filtered out for themselves. The only difference is that logged-in users will have their preferences saved for a longer period of time, because they're associated with their account, while logged-out users might have to update their settings periodically because it'll most likely be cookie-based. Cbrown1023 talk 20:09, 17 August 2011 (UTC)
this filter would be completely opt-in : IOW much drama is being expended on something that, even if it is implemented, will be so fuxored from its inception as to be useless. John lilburne 20:35, 17 August 2011 (UTC)

Categories as "filter tags"

Will it be possible to use file categories (Wikipedia and Commons categories) to "batch filter" all images in one or more of a set of categories?

If this isn't the right place for this question, please point me in the right direction, thanks! --87.78.139.205 20:01, 17 August 2011 (UTC)

The idea is that the filter system would fit in directly with our existing category system. So, if I understand your question right, yes, that's the intention. :-) Cbrown1023 talk 20:10, 17 August 2011 (UTC)
However, anyone examining the categories will rapidly see that our existing categories are not suitable, so a new set would need to be devised and maintained. Rich Farmbrough 23:08 17 August 2011 (GMT).
My guess is that file categories will be maintained more closely once the filter is implemented. If they turn out to be somehow unsuited to the task, a separate category or tagging system can still be discussed. --213.168.119.238 17:10, 18 August 2011 (UTC)
They are unsuited, though maybe not completely useless. We do not have categories which include, for example, all nudity. The first technical decision would be how to implement the filters, but this would need to be informed by how they are to be used. For example, if there are from a dozen to a thousand categories, using a bit-field in the rendered page associated with each image URL might work well - if individual users are creating their own lists this would be impractical. Then, based on this and other user requirements, the method of mark-up could be created. It would ideally not be part of the wiki-markup, since we would want it to include an authentication hash (possibly a SHA-1) of the image, checked at render time, and un-editable attribution. (Decisions would then need to be made on what to do about images which had changed since they were tagged/categorised.) Rich Farmbrough 17:42 18 August 2011 (GMT).
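
For what it's worth, the bit-field and render-time check described above could look roughly like this in Python (a sketch only: the category names, bit assignments and function names are invented, and a real implementation would live in MediaWiki's server-side code):

  import hashlib

  # Hypothetical filter categories, one bit per category.
  CATEGORY_BITS = {"nudity": 1 << 0, "violence": 1 << 1, "medical": 1 << 2}

  def pack_bitfield(tags):
      """Pack an image's filter tags into one integer for the rendered page."""
      field = 0
      for tag in tags:
          field |= CATEGORY_BITS[tag]
      return field

  def is_hidden(image_field, user_field):
      """Hide the image if it carries any bit the user has switched on."""
      return bool(image_field & user_field)

  def tag_still_valid(image_bytes, sha1_when_tagged):
      """Render-time check: has the image changed since it was tagged?"""
      return hashlib.sha1(image_bytes).hexdigest() == sha1_when_tagged

With a dozen to a thousand fixed categories this costs one integer AND per image; per-user free-form lists, as noted above, would not compress this way.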

If I wanted to design something that worked...

To be clear, I'm against image filtering, I voted against this proposal, and I think the solution is that children and adults should learn not to be offended or overly affected by images. Children around the world have had to become accustomed to living in garbage dumps, being child soldiers, being child prostitutes - I don't think the sheltered child who runs across a naked lady or a dead animal on Wikipedia has much to complain about.

But I recognize that for certain people the risk that they, or more likely one of their children, would run into some images is disturbing, and limits their use of Wikipedia. Obviously I want them to have as much access as possible. Now if they want to cut out the images, clearly a professional service (as mandated in schools and libraries in the U.S. under w:CIPA) should provide a better answer than the general mass of Wikipedia editors who have varying beliefs. However, the basic principle of Wikipedia is that a group of interested amateurs collaborating together willingly should be able to challenge a commercial product. Thus, we could, if desired, work out a viable image hiding scheme - just not this one.

A workable software scheme to accomplish this would be as follows: the first individual user, beginning from scratch, would write down or copy the names of the images that offend him into a text file, preferably with wildcards, perhaps with a text comment field afterwards that would replace the image when displayed.

Lilith (John Collier painting).jpg I'll never get any work done...
Dianne_Feinstein_2010.jpg If I never see her face again it will be too soon
Dianne_Feinstein* That goes for all of them
Dianne*Feinstein* Missed http://en.wikipedia.org/wiki/File:DianneFeinstein.jpg

So User:First puts this in User:First/ImagePrefs, installs/activates the script, and the new program stops these named images and groups of images from appearing.
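
The matching step such a script would perform is simple; a minimal Python sketch, assuming shell-style wildcards as in the example file above (parsing of the trailing comment fields is glossed over):

  from fnmatch import fnmatchcase

  # Patterns taken from the example ImagePrefs above, comments stripped.
  patterns = [
      "Lilith (John Collier painting).jpg",
      "Dianne_Feinstein_2010.jpg",
      "Dianne_Feinstein*",
      "Dianne*Feinstein*",
  ]

  def is_blocked(image_name, prefs):
      """True if the image matches any pattern in the user's ImagePrefs."""
      return any(fnmatchcase(image_name, p) for p in prefs)

  print(is_blocked("DianneFeinstein.jpg", patterns))  # True, via "Dianne*Feinstein*"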

Then User:Second decides he agrees with User:First, and has some ideas of his own, so he creates a file of his own at User:Second/ImagePrefs

{{User:First/ImagePrefs}}
John Boehner*
*John Boehner (though performance considerations might prevent initial *?)
Boehnerandstivers.jpg

Now eventually of course you have some editor who wants to provide services for all her co-religionists, so she goes over many files carefully and sets up a huge directory on her page, and several of her friends do likewise. They continuously prowl the earthiest entries and categories looking for new stuff to put in their directory, and all include each other as templates. Then anyone from their church starts an ImagePrefs file consisting solely of:

User:DefenderOfTheFaith/ImagePrefs/WikiPorn

and there they have it.

So, having laid down this idea, what would we need to do things effectively?

  • A good fast search through potentially hundreds of thousands of objected-to image names in a person's ImagePrefs, to find out quickly whether the image to be displayed is one of them (one possible approach is sketched after this list). Question: who takes the performance hit, the user's browser or the Wikipedia server?
  • A new parser function, {{#sort|duplicates=no}}, which can be applied to a whole ImagePrefs page and spit out every single entry in an alphabetized list minus duplicates.
  • A means to suppress display of the circular template errors, while of course not following any circles.
  • A way to tackle searches with wildcards - perhaps they could substitute in the list of hits, then attach a date parameter so they would only search files from newer dates on the next iteration?
  • A forum where Wikipedia editors can find people who block out images reliably according to their particular principles. But that can be provided elsewhere. Everything here is purely personal preferences, no admin intervention of any kind required.
  • If seeking to mandate this on children, a "user protection" feature might be invented. This would be a protection template applied to a page (other than the main user talk page) in user space which allows only that user, someone designated by him, or admins to edit the page, working exactly like the typical admin full-protection done to articles. Thus a parent who sets up a child's account would add a protection template to the ImagePrefs, allowing changes only from the parent's own Wikipedia account, to which in theory the child doesn't have the password. I'm not saying that's a good idea for us to get into, though.
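
On the performance question in the first bullet, one standard trick, sketched below in Python and purely illustrative, is to split the ImagePrefs into exact names, which go into a hash set, and wildcard patterns, which are compiled into a single alternation regex; each image then costs one set lookup plus one regex test instead of a scan over hundreds of thousands of entries:

  import fnmatch
  import re

  entries = ["Boehnerandstivers.jpg", "John Boehner*", "Dianne*Feinstein*"]

  exact = {e for e in entries if not any(ch in e for ch in "*?[")}
  wild = [e for e in entries if e not in exact]

  # fnmatch.translate turns each wildcard into an anchored regex;
  # joining with "|" lets one compiled regex test all patterns at once.
  # (Assumes at least one wildcard entry.)
  blocked_re = re.compile("|".join(fnmatch.translate(p) for p in wild))

  def is_blocked(name):
      return name in exact or blocked_re.match(name) is not None

  print(is_blocked("Boehnerandstivers.jpg"))  # True (exact name)
  print(is_blocked("DianneFeinstein.jpg"))    # True (wildcard)

Whether that cost lands on the browser or the server remains the open question; a set-plus-regex pass is probably cheap enough to run client-side for most list sizes.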

Now I don't think of these things as any kind of priority for Wikimedia - eventually we'll see the poll results and we'll know for sure. But if the voters actually want to go forward with some image scheme, then this is one of many alternate ideas that I believe would work better than the currently proposed "consensus categorization" scheme. I've also proposed a method to work with outside organizations [2] and just a very simple idea to let users specify the image resolution for all thumbnails, overriding the Wiki article's settings, and make it something very small. I continue to urge WMF not to try to have Wikipedia classify images. Wnt 21:04, 17 August 2011 (UTC)

And there we go: Maude Flanders is already out, a few hours after my original post. If we implement something like the "user protection" feature, soon a billion people will be children of the Chinese state.77.241.141.94 21:23, 17 August 2011 (UTC) (yes, I'm the guy of the Impractical and Dangerous post above; apparently, my English username is taken by some cybersquatter on wikimedia).

Protest voting

I voted (I'm an IP here, but I have an account as Anonymous44 on en.wiki) by leaving question marks on all questions to protest the lack of a clear commitment to - or the ability to specifically vote for - a restriction on the nature of the images that may be hidden. In my opinion, only images that are more or less universally disturbing or shocking on a purely physical level (death, mutilation, disfigurement, defecation, sex) may be hidden. If any image that somebody somewhere may consider "controversial" or "offensive" on the basis of their political stance or religious taboos can potentially be flagged, the technical feature had better not be available at all. I will be glad if other people express agreement with this stance.--91.148.159.4 22:46, 17 August 2011 (UTC)

I would have suggested that you vote 0, 0, 10, 0, ?, ?. Basically it means: I don't care about this feature, it's very important to disable it, and the rest is out of the question. --Niabot 00:18, 18 August 2011 (UTC)
The reason I didn't consider this to be an appropriate expression of my position is that I do think the feature would be valuable, as long as there is a clear commitment to a restriction of its use to the kind of cases described above. The problem is that the present phrasing of the proposal does not allow the expression of such a position, indeed it implies the opposite, as explained below.--91.148.159.4 00:29, 18 August 2011 (UTC)
(after edit conflict)Perhaps I should add, as a further justification of my position, that the current formulation of the proposal concludes with a question that practically excludes a restrictive application of image hiding! The question asks whether one should take into account "global or multi-cultural" views of what images could be considered controversial. Endorsing this clearly means endorsing the creation of lists of taboo images for every religion or sect anywhere in the world - which may involve images offensive to Christians (Piss Christ etc.), as well as the banning of all images of women for certain Jews, or of women without headscarves for many Muslims. Taking into account all possible taboos of this sort would lead to a ridiculous amount of hidden images. And the question is phrased in such a way that, if you don't answer positively, the immediate implication is that you are a chauvinist that seeks the imposition of some local, nationally restricted or particularistic view of which images deserve hiding - a position, which, of course, totally contradicts the principles of Wikipedia and is very uncool.--91.148.159.4 00:29, 18 August 2011 (UTC)
If this is actually a fait accompli, as many documents seem to imply, I think the appropriate response from the community would be boycotting the next fundraising: if the money is used to improve the servers, and the servers to run useless, dangerous and unpopular software, we can keep a fuller wallet and a clear conscience. Ditto if the money is used to pay self-styled experts to provide sanctimonious mumbo-jumbo as a basis for such implementation. But my impression is that the introduction of censorship is the first step towards a Wikipedia that is no longer supported by its users but by some kind of political or corporate mastodon, and which will, most likely, soon be superseded by some variety of Google product. 77.233.239.172 10:21, 18 August 2011 (UTC)

Correction

s/Read the questions and decide on your position./Read the statements and decide on your position./

Thanks. Rich Farmbrough 23:13 17 August 2011 (GMT).

Good call, Rich. Done. :) --Mdennis (WMF) 23:39, 17 August 2011 (UTC)

Hiding the Pokemon Logo as Inappropriate Content... Really?

Ok, so I know I have no right to vote, but I have used Wikipedia for about 5 years as multiple different anon IP accounts because I value the ability to contribute anonymously. I must say that I find it a bit amusing that the example from the Image Filter Referendum info page is the Pokemon logo. The Pokemon logo is harmless. We shouldn't be creating things akin to parental controls that can hide relatively harmless images and hinder free and easy access to knowledge online, which I feel is the point of Wikipedia. In fact, I really feel that this has little if any place in Wikipedia at all. But if the community agrees it does, it shouldn't be used to hide corporate logos but rather things like nudity. Not saying we should necessarily do that though... I just don't see the harm in the Pokemon logo. If anything is harmful to this website and its goal, it's this referendum. Consider this a voice of opposition to the proposal.--98.112.224.106 23:35, 17 August 2011 (UTC)

It's a joke, used as an example image -- not an example of something that someone would want to hide. (Though I suppose you never know...) -- phoebe | talk 13:45, 18 August 2011 (UTC)
I believe that the Pokemon logo was chosen because it is innocuous. :) People are far less likely to argue about the example image if it's not borderline...and I'm sure nobody wants to offend readers by selecting an image that is "not borderline" in the other direction. :D --Mdennis (WMF) 23:46, 17 August 2011 (UTC)
Maybe it's an allegory to the old anti pokemon figure war. ;-) --Niabot 00:06, 18 August 2011 (UTC)
Personally, I find it even more "amusing" that the example is a copyrighted picture (representing a trademark, to boot) and that there is no stretch of "fair use" that can accommodate this particular use. — Coren (talk) / (en-wiki) 00:29, 18 August 2011 (UTC)
A rationale for this image would go something like this:
  1. Purpose and character of the use, the "transformation" test. The original image is for entertainment. The use here is for a completely different purpose, part of a discussion of censorship mechanisms at one of the most popular web sites on the internet. The use is not for profit and may serve to assist more generally in understanding of censorship issues in general. As a highly transformative use this one clearly strongly passes this part of the four factor test. If you are familiar with the English language encyclopedia policy and believe that it contradicts this analysis, you are correct - this is the area where the policy is most different from the law, requiring uses that weaken the fair use argument. I'm sticking to the legal fair use definition, not en policy.
  2. Nature of the work: The work copied is an original creative fictional image. This gets quite strong protection against reuse, particularly in commercial contexts.
  3. Amount and substantiality: substantial portions of reduced resolution are used. The reduced resolution mitigates the use, in part because there is a requirement that the image be recognisable for the use to be effective. Some of the example content filters may block the display of this image: Children's games, Monsters and Horror, with other possible filters that could include it (violence, morally offensive to practitioners of some religions) and it is of some value to use an image that most adults may not find objectionable but which could be blocked by content filters. While the image is original and creative, the use makes this at worst only a weak negative factor.
  4. Effect on the work's value: no likely effect, partly because of the resolution and partly because its use is outside the types of project in which the image is normally sold.
  5. The fifth factor: does it look fair? Not a formal part of the four factor test but commonly used, this use for public debate in a non-commercial context strongly suggests that the use is fair and well within the Constitutional purpose of fair use to allow use of images for discussion and debate.
For the foregoing reasons, I see no difficulty at all in this use passing a fair use examination. If you disagree, I strongly recommend starting with the encyclopedia article and then going on to read some of the leading fair use case rulings, which often provide very useful reasoning behind the ruling. Jamesday 02:45, 18 August 2011 (UTC)
  • That's not the direction where case law has gone in the past. Using a trademark or copyrighted work has unfailingly been allowable when doing commentary on that work (whether it is scholarly, parody, or social commentary), or on some symbolic significance of that work (as the various Barbie-as-symbol cases have shown), but courts have been considerably less lenient towards "random" use when basically anything else could have been used for the same effect.

    In this particular case, the message would have been exactly the same had some free image been taken — or perhaps not: it certainly wouldn't have illustrated the Foundation's hypocritical attitude towards its own pronouncements as well? — Coren (talk) / (en-wiki) 13:06, 18 August 2011 (UTC)

It's probably not a big deal, but the licensing on the images says " Logos (if they appear in this screenshot) of the Wikimedia Foundation are copyrighted. Usage in media and press reports about Wikimedia and its projects is permitted, any other usage needs permission." - this is on pages with no WMF logos, only Pokemon logos. Do we now have a "Pokemon attack" to go with the "Pokemon defence"? Rich Farmbrough 15:11 18 August 2011 (GMT).
The typical "we care about the project" approach. Telling everyone that the content has to be free and that we have fundamental project rules, while obviously not caring about it at all. --Niabot 00:35, 18 August 2011 (UTC)
WMF projects allow for exemptions for logos, and I disagree with Coren that "there is no stretch of 'fair use' that can accommodate this particular use" (it's not even a little bit of a stretch, IMO). That said, I do agree that a freely licensed image would have been preferable as an example.--Eloquence 00:55, 18 August 2011 (UTC)
Same goes for the content of the pages. The content is licensed under CC-BY-SA and GFDL. The images are currently under the incompatible GPL. In any case, not a good example of how to work with licenses. --Niabot 01:06, 18 August 2011 (UTC)
I find non-free content offensive, but I would prefer it to be removed, not hidden from view. Kusma 01:58, 18 August 2011 (UTC)
LOL :-). Someone was actually asking me whether it might be possible to use this feature to browse Wikimedia projects in "libre mode", i.e. to hide all fair use content from view. (The answer is "probably not", unless lots of folks agree that it would be valuable.) --Eloquence 02:01, 18 August 2011 (UTC)
importScript('User:Anomie/linkclassifier.js'); and .nonfree-media{display:none} on ENWP. --Yair rand 03:17, 18 August 2011 (UTC)
Might I add that nobody at WMF seems to be able to spell the word "gladiatorial." And really, "gladatorial combat"? Pokemon?
This supposed feature is a farce. I guess that's what you get when your project is run by corporate lawyers cum non-profit board members. This is complete buffoonery and everyone knows it. Maybe that's the tactic to take in order to fight this - convince the board of trustees that their idea, above all else, is just stupid, that they are stupid and should leave wiki policy decisions to the people who are the lifeblood of the entire project.
Being friends with Jimbo does not qualify one to make unilateral wiki policy decisions. Belonging to the same golf club as Jimbo does not qualify one to make unilateral wiki policy decisions. Being a darling non-profit organizer who sips aperitifs with donors at fundraisers does not qualify one to make unilateral wiki policy decisions.
Seriously, ever take a look at who exactly is on the Board? Stu West, B.A. in History from Yale. Specialty? Nothing, except for being in charge. Matt Halprin, "corporate development" specialist (that is to say, specialist in nothing).
The only reason anyone contributes anything at all to the project is the promise of democratic policy-making. Take that away and you erode the entire foundation of the project. A wiki without editors is no use to anyone, especially the trustees. Your non-profit board member prestige lies in the size and significance of your non-profit. Trash the wiki userbase enough with this kind of blatant disregard for the very democratic process that made the project anything in the first place and eventually nobody will care about you or your project. 98.217.75.153 05:01, 18 August 2011 (UTC)

Shouldn't it be La Gioconda or Tux?

190.51.185.183 13:54, 18 August 2011 (UTC)

I also have some concerns about the licensing of the logo. If we're making an example, we should definitely opt for license-free images, like file:Placeholder.png. -- Sameboat (talk) 15:23, 18 August 2011 (UTC)
+1 It's not like we lack free pics to illustrate with. Cherry 19:01, 18 August 2011 (UTC).


Oops

I voted for this and only later discovered the discussion here. This discussion has dramatically changed my view on the subject. I feel that the way the question was presented and the way the questions were asked (and the large amount of space dedicated to how the images would be presented) gave the feeling that it was a foregone conclusion that the filters would be seen in a positive light and that it would simply be a harmless change to the wiki (i.e. the referendum was a simple procedural vote to make policy official). Having then seen the discussion, it didn't take me long to understand people's concerns with image filters and sympathise with them. It's a shame that this discussion is just a small link to the right of the main part of the page. Why is the vote presented in a very one-sided way, with reasons against the image filter only posted on this discussion page, linked in a small box to the right of the main section of the page? I don't understand. --Shabidoo 06:19, 18 August 2011 (UTC)

Try to vote again, it should overwrite your previous vote (at least that is how it worked in the past). Kusma 06:36, 18 August 2011 (UTC)
I did vote again and changed everything. It is still really revolting how the questions were formed, ESPECIALLY the FAQ, and how this discussion is STILL some little link on the right side of the screen, like some sort of after-thought. Really repulsive. --94.225.163.214 18:01, 18 August 2011 (UTC)

Real Vote or Fait Accompli?

I've voted "0" to this ridiculous proposal that will surely only open the floodgates to Censorship on Wikipedia, and furthermore I object the very notion that some people should be allowed (of forced in the case of Children, etc) to view a different Wikipedia then everyone else. I object to this on every level (and yes I read the 2010 report on controversial topics, or whatever it's called, and found it shockingly pro-censorship and not in the spirit of Wikipedia). So, is my vote of zero actually meaningful? Or has this already been decided by whatever board is apparently in charge? If this is just a talking exercise or "consultation" for something that has already been decided, then I (and I'm sure others) demand a real vote with a clear "yes" or "no" option on this "self-censorship" (you can guess how I would be voting). --Hibernian 06:18, 18 August 2011 (UTC)

Yeah. I'm not strongly opposed to the filtering plan, but I agree that a clear "no, we should never do this" option should be available. --Phatius McBluff 16:36, 18 August 2011 (UTC)
I would like to suggest, in the interest of Assuming good faith, that this is neither a real vote NOR a fait accompli. It's an experimental poll mistakenly labelled as a referendum that is going to use voting software. I don't think it's meant to be decisive; I think it's meant to be informative, to get info from the parts of the community that don't discuss, or at least don't discuss in a language our board can easily access.
The main thing they need help with, I think, is deciding whether to make a "good enough" filter or a 100% culturally neutral filter. The latter is a lot more consistent with our values, but we assume it'd be harder to build.
"no we should never do this" is still an option-- the board listens to discussions all the time, they haven't stopped now either. This is one of the first time they're going to listen to a second source of information.
The community needs this poll more than the board does, I think. All the young people I know in real life think a filter is silly-- but if I knew large swaths of potential editors really did want this feature, that would help me reach consensus in their direction, even if I personally have other preferences. --AlecMeta 19:27, 18 August 2011 (UTC)

No

No! Jim Shearer, Fruitland MD. 162.83.38.30 10:19, 18 August 2011 (UTC)

crying

I'm doing nothing; join your cause in doing everything!

But - will *they* stop whining?

Question: If the feature is implemented as suggested, will those people that are easily offended by articles having pictures of what they are about (i.e. people who are endlessly complaining on the talk page that the article vagina does have an actual picture of a !gasp! vagina in it) actually stop complaining? --79.234.118.10 11:24, 18 August 2011 (UTC)

Heck, no, they will ask for more, namely that vaginas are not shown to other people, either. Isn't that the big picture? 77.233.239.172 11:40, 18 August 2011 (UTC)
Exactly to the point! --Re probst 15:35, 18 August 2011 (UTC)


Wording of text for hidden images

This may seem like a minor point, but here it goes. In the mock-up for the hidden image display, there is text that says, "This image has been hidden because it matches your filter settings." This statement is ambiguous: it could mean "This image has been hidden because it matches your filter settings for what to hide" or it could mean "This image has been hidden because it matches your filter settings for what to show". Of course, the former is what is intended, and I don't think most people will fail to recognize that. However, in order to avoid ambiguity, shouldn't the text instead say something like the following: "This image has been hidden in accordance with your filter settings"? --Phatius McBluff 15:40, 18 August 2011 (UTC)

+1 The mock-up text above isn't faithful. "This image has been hidden because it matches your filter settings for what to hide" seems the most exact to me, but "This image has been hidden in accordance with your filter settings" may be better. Cherry 18:48, 18 August 2011 (UTC)

A "referendum" is a direct vote in which an entire electorate is asked to either accept or reject a particular proposal.

If I answer all the questions with "0", and many other people do the same, can that stop this project (not least in order to save the resources for more sensible improvements)? If it turns out that I cannot stop this project with my vote (and the votes of like-minded people), then I really do have a different understanding of the term "referendum" than the initiators. I feel somewhat coerced here. The question that should be asked first is: "do you want this?", and only then (given a "yes") how it should be dressed up. --Re probst 15:55, 18 August 2011 (UTC)

The Board has taken that decision off your hands, just as the decision about what belongs in which category is taken away from the reader. Since this question was not asked, we intend to ask it ourselves, draw the appropriate conclusions, and convey them to the WMF: Meinungsbild – Einführung persönlicher Bildfilter. --Niabot 16:18, 18 August 2011 (UTC)

General view + essay

Long essay that says:

  • Filtering is anti-wiki.
  • But the proposed filter is good enough.
  • If lots of people want it, let them have it.
  • A filter will help us relax over 'scope' concerns.
  • Easing scope concerns should help us grow and expand. --AlecMeta 17:48, 18 August 2011 (UTC)

Referendum?

I am very puzzled about why this is called a "referendum". Even according to Wikipedia's own article, "A referendum [...] is a direct vote in which an entire electorate is asked to either accept or reject a particular proposal." In this case, as I think several people have already mentioned above, the exercise instead seems to be about deciding how to implement something that, it has already been decided, will go ahead. (See wording like "The Board of Trustees has directed the Wikimedia Foundation to develop and implement a personal image hiding feature. [...] The feature will be developed for, and implemented on, all projects.")

Unless I'm thoroughly confused, "referendum" seems to be completely the wrong word. If this is really not a referendum at all, could someone please remove all references to that word? 109.151.39.170 20:19, 18 August 2011 (UTC)


Way too complicated/burdensome process.

I came to this page expecting to click a button and have my vote count. You want me to do what?? Follow a bunch of steps hidden halfway down the page and then go to another page and copy/paste stuff and THEN follow instructions there? Nuts. This is why wikipedia editor numbers aren't exploding. You're making this stuff too hard, and too hard to get into. Guess I'm too casual for you hardcore programmer/coder types. Parjlarsson 16:43, 17 August 2011 (UTC)

I'm a wiki editor and I'm a software developer, and I can't figure it out either. I've tried to follow the instructions to the best of my ability and I get nowhere; I just get the message "Sorry, you are not in the predetermined list of users authorised to vote in this election."

- Finally figured it out: I have to go back to my preferred wiki project and copy the second part of the link listed in the eligibility section. This is far too complicated for most users/editors. There is an old principle that I think should be applied: K-I-S-S.

- I'm a wiki editor and I'm a software developer: I still cannot figure out what my "preferred wiki project" is.

I would like to move that, if we have low participation in this vote, the vote be scrubbed and a new vote with steps that most users can follow be performed, as with low participation we don't really have consensus and the collective conscience will not be represented. Abyssoft 03:22, 19 August 2011 (UTC) http://en.wikipedia.org/wiki/User:Abyssoft

Process

Unless someone makes a case for a formal referendum process that is approved as a standing part of WMF governance, no outcome will be legally binding on the WMF without a Board resolution. However, any group within the WMF can set up a community referendum and commit to being bound by its outcome... which I believe is what's going on here.

Certainly the discussion and revision process of RfCs, rather than a focus on voting, is desirable. Discussion of concerns and suggestions raised, and the outreach involved in responding to them, will be as valuable as the vote itself. I don't know how specific the referendum text will be, but I hope that the first outcome of holding one is feedback that gets incorporated into the proposed implementation. SJ talk | translate   04:02, 1 July 2011 (UTC)
I'm a little confused - for whom will the outcome be binding? aleichem 01:50, 24 July 2011 (UTC)

discussion (how to get large-scale community feedback)

One thing that I am interested in is figuring out how to get large-scale community feedback on issues like this. This referendum is very much an experiment; long-term Wikimedians will know that we have almost never done something like this (the licensing switch is the last large-scale vote we did, and aside from the Board elections there have been very few others). So: Why this? Why now?

Let me review a bit of the history of this project. This filter idea has already been through something of an unusual process, in that the idea was suggested in a study that was contracted in order to get more information and compile perspectives around the ongoing (and somewhat intractable) community debate around controversial content; the author of that study tried hard to get community perspective in writing the study, as well as afterwards. It was then discussed in the "normal" way on Meta and mailing lists; the whole thing then went to the board, who spent another few months reviewing and discussing it internally. The Board's final decision, which was influenced by but not wholly inclusive of the report, is in these two resolutions: wmf:Resolution:Controversial content, and wmf:Resolution:Images of identifiable people. There had been a great deal of community pressure on the Board to say something, as well as an internal feeling that we needed to lead on this difficult issue.

So the Board felt like some sort of feature for readers to not have to see images they find objectionable would be a useful feature for Wikimedia sites, as a top-ten global website, to have. Unlike everyone else in the top ten, however, we don't want to make it hard to view a totally unfiltered Wikipedia; we want people to actively choose (unlike Google or Flickr, which have default safe-search on). And we don't believe in censorship (as distinct from editorial judgment, which we all use all the time on the projects). But we recognize the large constituency that does want some sort of "safer" browsing experience.

Balancing those needs and principles is the job of the team that will design and ultimately roll out the feature (including, to quote SJ above, the possible tradeoffs between a solution that helps some users but frustrates others and a better solution that takes more energy to develop). But it's not like a behind-the-scenes piece of code that doesn't affect anyone; both editors and readers will need to use the thing, and have input into its design, for it to be useful. Here's the thing about community discussions on Meta, or the mediawiki wiki, though -- there's a relatively small number of people from a relatively small number of Wikimedia communities who participate in such discussions. There's often a sensation of being in an echo chamber, in fact, with the same people participating.

So we are trying something new. Think of it as a very, very large-scale RFC, designed as a vote so that way more (orders of magnitude more) people can usefully participate. So far, that aspect is working -- there have been thousands of votes cast. Even on this page, there are tons of new people participating (as well as some that have before), which I think is great -- especially after having been personally involved in discussions about this for the past year and a half :) The idea is: how do we get lots of Wikimedians across all communities and languages involved to get as broad a view as possible?

So it's an experiment. I am sure that the questions are not the best. I know there are problems with the voting interface. I know that there's no explicit up or down vote on whether you want this. I know that many people see this as an affront to their principles; I know that many others probably just don't care or are confused; and others want to see something like this or something even stronger implemented. But, this referendum is how we are attempting to collect this feedback. If it doesn't work, fine -- we will know for next time. It hasn't been uncontroversial, internally or here. But we (the WMF, and the volunteers who have worked on this) thought it was worth giving something new a shot. -- phoebe | talk 14:07, 18 August 2011 (UTC) (member of the Board of Trustees, 2010-2012)

p.s. there's also lots of speculation about the board on these pages; if you want to learn more about the ten trustees, we all have bios here. -- phoebe | talk 14:34, 18 August 2011 (UTC)
Unfortunately there is no way to contact these persons. --Eingangskontrolle 11:52, 19 August 2011 (UTC)
I think the idea of a referendum, although problems and issues have been identified, has been reasonably uncontentious. The issues raised here have been mainly about the handling of this particular exercise, and the subject of this particular exercise.
And I also think that by and large the community accepts that this is new territory and we all have to learn as we go along. For example, I just found "The Wikimedia Foundation, at the direction of the Board of Trustees, will be holding a vote to determine whether members of the community support the creation and usage of an opt-in personal image filter" - at some point this message and what the survey appears to be attempting have diverged. This is a communication and process failure, no big deal, it happens all the time. The Wiki-way would be to fix it. However we are now in a "mid-stream" position, and being told things are (mainly) frozen. Again we understand the good-faith motivation behind this - while some of us think that the problems of the process and wording are great enough to throw doubt on any results, quite apart from the substantive issues that have been raised.
So really the questions we should be asking now are:
  1. Is the referendum/vote/survey process a good one? And for what types of issues?
  2. What lessons can we learn form this exercise?
  3. Given the discussion and votes what should we do about image filtering?
Rich Farmbrough 14:48 18 August 2011 (GMT).
Thanks. I totally agree, these three questions are the ones to ask. -- phoebe | talk 15:02, 18 August 2011 (UTC)
A fourth question would be whether filtering of content would be accepted at all. I spoke to many people in recent days and there are quite a lot of contributors who are strictly against any filtering of content. There are many reasons for this. First, we would be trying to make a neutral judgment on what is problematic or not. That was never done before and is strictly forbidden by NPOV rules; categorizing content as offensive or not is the complete opposite of that. It could also be called a violation of the basic rules of an encyclopedia. Another thing is the obvious side effects. If we categorize images, we indirectly categorize any page that contains the images. This could be used by governments to selectively suppress information about distinct topics (their own filtering in Iran, China, North Korea and many others). We would allow people to view the images, but governments could implement filters based on our categories without offering this option. The third big argument would be the easy possibility for the WMF itself to expand this approach. You give supporters of censorship another good argument, once the infrastructure is already present, to make censoring of content the default (opt-in vs. opt-out). So far I could not find any statement regarding these points:
  1. How does the board think about the conflict between the NPOV principle and hiding content based on personal judgment?
  2. Does it see a risk that the introduction of such categories might be exploited by interest groups?
  3. Will it be the first and only step, or will it be the first step with more to follow?
--Niabot 16:09, 18 August 2011 (UTC)
Phoebe, while this may in fact be a "request for comment", it is not what most people would think of as a referendum. It is clear from comments on this talk page that many people have not understood that implementation of an image filter has been mandated by the WMF and this "referendum" is merely about implementation. I asked Philippe, both here and on his talk page, to clarify that when I saw that it was causing confusion here. He ignored my requests. Perhaps you could add some clarification to the main page about that? I think your comment that there is "no explicit up or down vote on whether you want this" comes across as disingenuous, since it has already been mandated and whether users want it or not is irrelevant. As I suggested on Jimbo's en.wiki talk page, I feel that any discussion of design and implementation is premature if the image classification process has not been worked out and probable users consulted. (Incidentally, I also pose in that discussion some of the basic questions which I think should be answered and/or posed to the community.) I think as an experiment, this one has gone on long enough to see the result. Delicious carbuncle 16:56, 18 August 2011 (UTC)
Sure. I'd be happier seeing it called an RFC, or a consultation; this is misnamed as a referendum. -- phoebe | talk 18:12, 18 August 2011 (UTC)
But there IS no discussion. The vote is set up so that a user clicks on the links, reads a question phrased in a way which gives the filter a positive light, and sees a FAQ making the filter seem harmless. This actual discussion is behind a tiny link on the right side of the page, easy to miss as I did. It's pretty revolting actually. --94.225.163.214 18:03, 18 August 2011 (UTC)
Phoebe, you seem to have ignored my request to clarify the terms of the referendum. Was that because you think the terms are clear? It would be difficult to read this talk page and come to that conclusion. If you, Philippe, or another WMF employee/member are not going to clarify things, it would at least be polite to tell me why you are not doing so. Thanks. Delicious carbuncle 20:35, 18 August 2011 (UTC)
I don't know if it will answer your specific questions, but you may be interested in reading IRC office hours/Office hours 2011-08-18, in which Sue discusses the referendum at some length, including her ideas for handling the outcome. --Mdennis (WMF) 21:12, 18 August 2011 (UTC)
I'm sorry that you, personally, feel ignored by me. I am attempting to clarify and participate as much as I can (in my free time just like everyone else) on this talk page. The link Maggie gave above is a good discussion as well. The committee that put this all together (Philippe as coordinator, comprised of others including myself) is hesitant to go changing the wording in midstream; that is the main barrier to rewording the page. Not sure what else you're asking for (cutting it off entirely?) -- phoebe | talk 15:35, 19 August 2011 (UTC)
Having followed this for a bit, I totally concur that the referendum is a good thing. While there are lots of things to criticize about this one, I think the point is that the foundation is experimenting with ways to get feedback from massive groups of people who don't share a common language. This is a very important goal.
By referendum, they actually mean "secret-balloted, experimental poll". But that's okay-- they're learning too, and they're learning in the right direction.
Discussion is still more important than the poll, but the results of the poll, once made public, will be very helpful in reaching consensus.
The board isn't trying to rig this poll. I know the options are very unclear, but it's a genuine attempt to solicit feedback from the community.
We need election/referendum practice. The last board elections weren't really "successful elections" with discussions of issues -- but since we got a fabulous result anyway, now's the perfect time to practice elections. --AlecMeta 19:14, 18 August 2011 (UTC)

Process query

I understand the vote is being conducted by a third party. And yet the wiki on which the voting page is hosted appears to be run by WMF people ([3]). Can someone explain how this works? Rich Farmbrough 17:13 18 August 2011 (GMT).

The wiki's actually run by the same third party, SPI. They seem to have taken the WMF logo to identify it as the wiki for a WMF vote :) -- phoebe | talk 17:55, 18 August 2011 (UTC)
We don't have access to the server it runs on, and I'm pretty sure we don't have the key to tally the votes — only SPI can do those things. The only Wikimedia people with any other access on the wiki are the committee, who are "election administrators" in SecurePoll. That basically means we can strike votes that we think were cast by sockpuppets. Cbrown1023 talk 18:53, 18 August 2011 (UTC)
And the advisers to the committee. OK that's much clearer. Rich Farmbrough 19:35 18 August 2011 (GMT).
To confirm, WMF does not have the key to tally.  :) Philippe (WMF) 13:41, 19 August 2011 (UTC)

Request for semi-protect of this talk page (translate: request to filter views)

This discussion page is increasingly plagued by anons who purely troll or even spam. Apparently they don't have a single clue about how the filter works and instantly label it as censorship. -- Sameboat (talk) 07:16, 19 August 2011 (UTC)

+1. Seeing as only registered users can vote, I don't see why anons need to be able to contribute here, given the spam etc. Promethean 08:13, 19 August 2011 (UTC)
Completely against; criticism of this referendum (which is NOT a referendum but a vote of confirmation) is not spam. --94.225.163.214 09:27, 19 August 2011 (UTC)
Perhaps you should request a filter feature that will also allow you to "opt-out" of things you find offensive? — Coren (talk) / (en-wiki) 11:25, 19 August 2011 (UTC)
I don't find any humor in your words. A trolled discussion page only discourages real users from offering constructive suggestions or questions. -- Sameboat (talk) 14:32, 19 August 2011 (UTC)
That would be because there is no humor in my words. Perhaps a bit of half-amused self-referential irony, but the fact that you are unable to see the parallel with your demands that the very users who will be victimized by this "feature" (i.e. the much more numerous readers) be disenfranchised is cause for pity and sadness, not mirth. — Coren (talk) / (en-wiki) 14:36, 19 August 2011 (UTC)
I can only say that one of my votes opposes turning on the filter for anons. -- Sameboat (talk) 14:46, 19 August 2011 (UTC)
The problem is that once your images are tagged, it becomes trivial for a network device or software not under the readers' control to filter, user desires be damned. I can think of a number of governments that will be pleased as punch at the Foundation working so hard to make their jobs of "protecting their citizens" that much simpler. — Coren (talk) / (en-wiki) 15:05, 19 August 2011 (UTC)
Readers who don't experience internet censorship need this feature. As for the readers who suffer from governmental internet censorship, things won't get particularly worse or better with or without this feature. -- Sameboat (talk) 15:15, 19 August 2011 (UTC)
Sorry, but "crap can already happen" does not justify making crap of our own in my book. — Coren (talk) / (en-wiki) 21:47, 19 August 2011 (UTC)

Another censor! This bug will be introduced for the benefit of the unregistered user, and you want their voice to be ignored? And so far no one has a clue how this filter will work, because A) nobody really knows, or B) the ones who know are hiding the real thing. It feels more and more like option B is true. --Eingangskontrolle 14:15, 19 August 2011 (UTC)

Please and thank you, read the content page about the filter before complaining. This is absolutely NOT censorship. Even a moderately smart child can easily revert a filter set by his parents. How could such a thing be called "censorship"? I can't stand that there are so many lazy people blaming the filter and the referendum before they actually understand the whole point of it. -- Sameboat (talk) 14:29, 19 August 2011 (UTC)
It is precisely because I've read up on this that I disagree with your assessment--this is indeed candy-coated censorship. MicahMedia 03:21, 20 August 2011 (UTC)

Oh, we understand very well. "Wehret den Anfängen" ("resist the beginnings"). "What did you do against Hitler in 1932?" is one of the questions that was asked in Germany. Some groups want to ban images, and that's censorship. --Eingangskontrolle 14:51, 19 August 2011 (UTC)


The obvious reason why it will not work

The decision process is based on the "wiki" system, and everyone knows nothing good can come out of a wiki (which is based on people imposing their references as more reliable than everyone else's). 190.175.201.164 20:15, 19 August 2011 (UTC)

Educational materials

I don't understand what "educational materials" can mean on this page. Materials to educate the voters? Nemo 10:47, 1 July 2011 (UTC)

So that they could make the right choice... ;-) --Mikołka 10:58, 1 July 2011 (UTC)
An image or an entire set of filter options might help suit different groups. — The preceding unsigned comment was added by Marsupiens (talk)
This is a little late, but "educational materials" might be an inadvertently US-centric edit -- in the US every voter gets a packet of information about the upcoming vote: who is running, what the full text of the ballot referendums is, etc., before every election. I always assumed this happened in other places too until my UK friends were in town before an election and expressed amazement. At any rate, that's the kind of educational materials that are meant -- to answer the question "what's going on?" -- phoebe | talk 18:37, 31 July 2011 (UTC)


Phoebe, thanks for the explanation. Yes, it's a US-centric thing; I never heard of it anywhere in Europe, for example. --Dia^ 07:59, 16 August 2011 (UTC)


Spam/Canvassing

Please do not send spam: people fill in their e-mail address to recover passwords and some other user-configurable stuff. Do not use this information to send unsolicited e-mail. Also: the vote is already underway, and this could be construed as canvassing, especially because the text is not neutral, but rather very positive about this (ridiculous) proposal. Zanaq 17:58, 18 August 2011 (UTC)

The e-mail will only be sent to those who have set their accounts up to receive e-mails from other users. You could also opt out of all such mailings if you would like to. It's not canvassing — it's an e-mail notifying people of the referendum. It was always planned that we send one, just as we do with Board elections. Everyone who meets the criteria has a say in this, not just the people who visited Wikipedia when the banner was running. Cbrown1023 talk 19:00, 18 August 2011 (UTC)

Voting for contributors.

Having contributed, it seems to me that I have given evidence of a greater interest in the future of Wikipedia than others who have not, and perhaps I should therefore have earned voting privileges.

You should have indeed! Voting for contributors is the general idea. But it's pretty hard to keep track of whether you have contributed or not if you don't create an account and log in to make your edits. -- phoebe | talk 15:20, 19 August 2011 (UTC)
But you will have to admit that the text is written in a very biased way that is nowhere close to neutral. --Niabot 19:07, 18 August 2011 (UTC)
Canvassing is prohibited for straw polls and deletion discussions. This is a banner-advertised movement-wide decision. We want every possible voice involved. My understanding is that you can talk to whomever you like about the issue, until a foundation authority tells us not to. (Though yes, everyone go easy on sending unwanted email, try to be neutral, and follow the spirit of No Canvassing as much as possible.) --AlecMeta 19:19, 18 August 2011 (UTC)
It also goes out to all eligible voters, not just people we think will vote the way we want. If we only sent it to Alec and his friends because we thought they'd support it, but didn't send it to Niabot because we were afraid he wouldn't like it, that would be a problem. Cbrown1023 talk 00:30, 19 August 2011 (UTC)
When I gave my email address for my Wikipedia account, I did so so that other users could contact me, not so that the Foundation could send me a bot-produced mass e-mail. The Foundation is not another user. --94.134.209.82 07:25, 21 August 2011 (UTC)

I have not received a personal invitation yet. Perhaps that's because I was against Vector? --Bahnmoeller 18:23, 19 August 2011 (UTC)


Sorry, you cannot vote in this election if you are blocked on 2 or more wikis.

When I try to access the voting page I get the message "Sorry, you cannot vote in this election if you are blocked on 2 or more wikis." This is not particularly helpful; I'm not aware of being blocked on any wiki. I imagine other users may be experiencing this problem. Could the message please be updated to indicate which wikis the account is blocked on, so that the user can investigate and remedy the situation? In the meantime, could someone please let me know how to find out which wikis are blocking my account? —Psychonaut 12:29, 19 August 2011 (UTC)

This page will give you the information you're looking for. Werdna 12:32, 19 August 2011 (UTC)
Thank you. If it's not possible for the "you cannot vote" message to indicate which accounts are blocked, then it would be useful if it at least provided a link to that tool so the user could check himself. —Psychonaut 12:49, 19 August 2011 (UTC)


Can't seem to vote

Hi, I got an email asking me to vote in this, but when I go to the page where I'm supposed to, I get a message that I'm not in the list of people eligible to vote. What am I doing wrong? Psu256 16:00, 19 August 2011 (UTC)

I'm having the same problem. I'm an active user on the English Wikipedia with over 12,000 edits to my name, I have a global account and I'm not blocked on any projects to my knowledge. However, the system says I'm not on the list of people eligible to vote. What's going on here? elektrikSHOOS (talk) 16:20, 19 August 2011 (UTC)
Never mind, I was trying to vote from Meta. Psu256, you have to vote from the wiki project you're most active on: so, if it's the English Wikipedia you're active on, you have to go here. elektrikSHOOS (talk) 16:25, 19 August 2011 (UTC)
That wasn't exactly easy to figure out :) Even when you gave me that link, I wasn't signed in, and when I went to sign in on the secured server, it took me back to the main page rather than the special page for SecurePoll. Psu256 17:49, 19 August 2011 (UTC)
If you continue to have problems, please note them at Image filter referendum/Email/False positives. Even though this is not exactly what that page is for, I'm told it's the place to report issues of this sort as well. :) --Mdennis (WMF) 16:30, 19 August 2011 (UTC)
It wouldn't let me vote the first time I went to the page, but I went back a few minutes later and it worked. -- Gordon Ecker 21:22, 19 August 2011 (UTC)
Actually, never mind, it didn't work because I thought this was Wikipedia and went to Special:SecurePoll/vote/230 instead of w:Special:SecurePoll/vote/230. -- Gordon Ecker 21:24, 19 August 2011 (UTC)
I think part of the problem is that it isn't obvious that you've navigated away from Wikipedia to the metawiki if you aren't paying close attention when you click on the banner that takes you to this content page. Especially when you are used to using the secure server - the URL at the top still says https://secure.wikimedia.org/wikipedia. The instruction to go to your "home wiki" isn't very helpful if you don't know how... Psu256 23:18, 19 August 2011 (UTC)

What is this stupidity

I've received an invitation to this referendum three times. I have no understanding of the background, but I guess that the authors of this initiative have too much free time. Why would you want to resolve a contentious issue with a popular vote? It should be resolved by argumentation and reasoning, unless you want to force popular stupidity on everyone. There's no reason to poll someone like me who's been inactive in any editing for years. Also, since it turns out I have multiple accounts (all inactive for a long time, at least one of them for 7+ years), am I allowed to vote three times? Whoever started this "referendum" should be sacked, if only for the volume of junk email that must have been generated. 84.230.245.50 19:49, 19 August 2011 (UTC)

Since posting this comment I got even more invitations, and Gmail is now automatically marking them as spam, even though I haven't told it to. Way to fail. 84.230.245.50 20:52, 19 August 2011 (UTC)

What is the question being asked?

In referendums, there is normally a single yes/no answer for those being polled to answer. The announcement doesn't give any hints as to the phrasing of this question - is there a draft available, and/or will the draft question be posted before the poll for analysis/comments/discussion? Or will there be multiple questions or multiple answers so that a diverse range of voter viewpoints can be sampled? Thanks. Mike Peel 20:51, 4 July 2011 (UTC)

I also believe that there should be a suggested implementation (or set of implementations) set out prior to this referendum, given the range of different possibilities available for such a filter. Is this the plan, or will the referendum be on the general issue rather than the specific implementation? Mike Peel 20:59, 4 July 2011 (UTC)
The referendum text will be published in advance. It will likely be a yes/no question (perhaps with an option for "I don't feel that I have enough information", as with the licensing migration). The referendum is on the general concept, as much as is humanly possible. Philippe (WMF) 01:25, 5 July 2011 (UTC)
I appreciate the idea that we are allowed to see the referendum text before the actual casting of votes. However, in my humble opinion, we should also see the text - in its final form, or (if N/A) as a draft - early enough to be able to ponder and discuss it. Therefore, I hope that "in advance" means "ASAP", and, more particularly, at least four weeks before the start of the referendum.
Among others, it may take some time to translate it to the main languages; and, when it comes to finer nuances, most non-native speakers of English would benefit by having access to a good translation. JoergenB 19:21, 5 July 2011 (UTC)
What's the point? I get an email invite to vote. I read the proposal, then attempt to vote. Instead I get "Sorry, you are not in the predetermined list of users authorised to vote in this election." I would like to complain about that, but in the past any attempt to point out a Wikipedia problem has been met with threats from the Wiki-Nazis of being banned from the system. Wikipedia is slowly rotting from the inside out.--NateOceanside 16:45, 19 August 2011 (UTC)


Biased referendum

In addition to the fact that this aids censors, it is clear to me that whoever is in charge of this "referendum" is also heavily pushing for it. I don't remember seeing a vote more biased than this one (perhaps because Wikimedia projects usually strive for consensus, not direct voting).

Why on earth is there no argumentation opposing it in the description and the FAQ? Or is this too awesome to have significant downsides?

But if this change is so great that opposition to it is unimportant, why the heck is it put to a referendum at all?

What were people thinking? That by holding this one-sided referendum they could claim there is "broad support"? This bias runs from the questions asked (which make it look like the matter is basically decided) to the FAQ. It is disgusting!

I have put a more involved argument against the tagging of sexual content itself on the 2010 study page. Summarizing it here: I believe that a culturally neutral definition of what sexual content is is basically impossible, and even if it existed, there would be no objective way to tag such images.

--187.40.208.90 04:46, 19 August 2011 (UTC)

In fact, it seems it was already decided by the powers that be --187.40.208.90 05:08, 19 August 2011 (UTC)
Are there any questions on the referendum that have not already been decided? — Internoob (Wikt. | Talk | Cont.) 20:12, 19 August 2011 (UTC)

Meaning of the Referendum

I read the questions as listed now as "We've decided to implement this feature, so how should we do it?", and not as "Should we implement this feature?" Is that a correct reading?--Danaman5 06:06, 4 August 2011 (UTC)

  • Same feeling. "If you say it's important, perhaps we will do it. If you say it's not important, we will do what we want to do." Meanwhile, who will grade the images? The Board? You? Me? The Taliban? Are we invited to the "Meet Your Censors" party? NVO 08:55, 4 August 2011 (UTC)
    NVO, the current idea is that (if implemented), it would rely on the category system. This means that you could say that you didn't choose to have anything from commons:Category:Abused primates (for example) appear on your screen (by default; you can override it at any time). WhatamIdoing 19:45, 4 August 2011 (UTC)
    It is? Are you sure? Could someone please verify if this is, in fact, the "current idea"? --Yair rand 05:49, 8 August 2011 (UTC)

I would really like an official response to this from someone involved in planning this referendum. Is it already a certainty that this feature will be implemented, or not? If not, the questions really need to be revised.--Danaman5 01:00, 7 August 2011 (UTC)

  • Hi Danaman5, yes, the Board passed a resolution asking for this kind of feature to be developed; so that's a given. The questions are asking for community input in how it is developed and implemented. -- phoebe | talk 05:00, 15 August 2011 (UTC)
Rely on "the" category system? Many wiki projects have their own image uploads and their own categories, so we are dealing with many category systems. Currently the Commons category system is really quite an internal thing, the images displayed in an article on any other project are not affected in any way by what we do to the category system, images are recategorized all the time, category structures are modified all the time, if filtering relies on the commons category system we can now cause huge disruption very easily in our everyday editing. We are currently going backwards as regards getting new uploads properly categorized, I doubt the extra energy that will go into maintaining categories that comply with the filtering requirements will help that problem. I suppose an optimist would hope that the new emphasis, put on Commons categories, would mean that more people would come to Commons to help. But I expect that the filtering system will attract zealots trying to censor material by categorizing images that they (and no one else) want hidden, and given the current backlog in categorization, they will not be rapidly reverted. --Tony Wills 09:22, 9 August 2011 (UTC)
Ok, I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering; that would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)
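To make the "separate flat category branch" idea concrete, here is a minimal sketch in TypeScript. It is purely illustrative: the flat "Filter:" branch, its names, and the data shapes are all invented here, not anything the proposal actually specifies.

  // Illustrative only: filter categories live in their own flat branch,
  // separate from the regular Commons category tree, so everyday
  // recategorization of files cannot disturb the filter.
  type FilterCategory = string; // e.g. the invented name "Filter:Violence"

  interface ImageInfo {
    title: string;
    filterCategories: FilterCategory[]; // flat filter branch only
  }

  // Categories this reader has opted out of, taken from their preferences.
  const hidden = new Set<FilterCategory>(["Filter:Violence"]);

  function shouldCollapse(img: ImageInfo): boolean {
    return img.filterCategories.some((c) => hidden.has(c));
  }

  const example: ImageInfo = { title: "File:Example.jpg", filterCategories: ["Filter:Violence"] };
  console.log(shouldCollapse(example)); // true: collapsed for this reader, one click to reveal

Because the filter branch is flat and separate, renaming or restructuring ordinary content categories would leave the filter untouched, which is exactly the property described above.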

The feature can be developed, but it can't be implemented on any particular wiki without community consensus there. If the board of the foundation and a wiki are at odds with each other, there is always the right to fork, of course. --Kim Bruning 15:33, 19 August 2011 (UTC)

WMF "censorship" log

The Image filter referendum page on the Meta wiki is misleadingly titled, since it suggests that Wiki(m|p)edians have the possibility of rejecting a decision already taken by the WMF Board of Trustees to implement an image filter feature. After less than half a day of discussion of a community proposal to use a more accurate page name, an Associate Community Officer:

  • removed the move (page rename) proposal from the page prior to community consensus, in contrast to normal practice: diff
  • removed text within the page that clarified that the consultation/survey is not a referendum and that there is no (unambiguous) option to reject the planned feature: diff

and the WMF Head of Reader Relations froze the confused version of the page against editing by anyone other than sysops:

If the proposal page title and the proposal content were unambiguous, then there would be no need for community editing. Now that the pages have been frozen by people with official WMF roles (employees?), the community is left with far fewer of the usual consensus-building methods that are normally possible on Wikipedia.

Does this count as censorship? IMHO it's not "hard" censorship, but it does feel like a soft form of censorship: the Board of Trustees have made a series of errors and people in official WMF roles feel that the normal wiki culture no longer applies. Instead of correcting the error, it has been frozen in.

The Head of Reader Relations' user page explains that the role represents the "vast majority of the users of Wikimedia projects: the readers who rarely or never edit". Are the vast majority being protected by being encouraged to give en-masse answers to a misleadingly titled survey whose results will be ambiguous to interpret? Surveying the opinions of "the vast majority" is not a bad thing, but doing so with something that is a bit like a Rorschach test is unlikely to produce any useful answers, only confusion that can be interpreted however one wishes.

Dear Board members: the only reasonable option that I see right now is to stop the present so-called referendum and reschedule it for some time in the future, after Wiki(m|p)edians have had the chance to improve the proposal title and the proposal content to a state where it will be useful and include a clearly stated option of rejecting the feature. Better a minor embarrassment that is corrected than a plunge forward into a mess. A Rorschach test is not a referendum.

Boud 21:06, 17 August 2011 (UTC)

I don't think we should move the proposal title in the middle of the referendum; I wish you would stop trying to, in order to make a point. That's also poor process. -- phoebe | talk 13:43, 18 August 2011 (UTC)
The only poor thing is the behavior of the board. The title was misleading from the start. Resign and let us vote again. --Eingangskontrolle 11:45, 19 August 2011 (UTC)

How to express a "no"

I'm unclear about how to express a "no, we should never do this" on the actual referendum. Obviously, the first question, about how important it is that we adopt it, is easy to answer. But the others are about how important we see various features; what's clearer: a "0" for all, on the basis that neither it in general nor its features should be adopted; a "5" for the features, on the basis that it doesn't make any difference what the features are if we don't do it; or a "9" for some or all of the features, on the basis that if the consensus is to adopt it, or the board forces us to despite our wishes, then at least we should have the least disruptive filter possible? I'm concerned that a zero for all might be taken as meaning that, after the possible decision to adopt, when we start considering the individual features, my vote would be added to those who think we should not include them in particular. DGG 16:32, 18 August 2011 (UTC)

There is no referendum (even if it is called that) and there is no "No" option. The best way to express a "No" that I could find was to reply with "0,0,10,0,?,?" and an additional comment that I'm strictly against any kind of content filtering. --Niabot 16:37, 18 August 2011 (UTC)
Yeah, I too think this should have included a "no, we should never do this" option. I say this because the question "Is it important for the Wikimedia projects to offer this feature to readers?" isn't very meaningful. How important is it? Well, really important if you're strongly in favor of the plan. Of course, someone who's strongly against the plan will put a zero, but that doesn't really express their position. --Phatius McBluff 16:43, 18 August 2011 (UTC)
I'm also a little confused about how to 'score' my position:
  • Worst possible outcome: a non-culturally-neutral filter, showing us willing to risk lives to offend Muslims but unwilling to offend the Christians who pay our bills. This would be a disaster.
  • 3rd best outcome: the 'proposed' filter is implemented if it's a true strategic priority.
  • 2nd best outcome: the 'perfect' filter is cheap, we build it and show it off.
  • Best possible outcome: Nobody wants the filters and neither do I! Unlikely.
It's not a problem for me -- I write in English, so I can have more effect with my words than most users can with their poll vote. Voters in other languages may have a tough time translating their views into numbers, but every data collection has its limits. You live and you learn.
Don't worry. Based on my knowledge of science fiction, I believe no less than half our board and staff are on the 'fictional' nootropic "NZT". They're learning all the time. --AlecMeta 01:51, 19 August 2011 (UTC)


Why do we need to vote?

I'm just wondering why a vote is needed here. It seems like such a no-brainer to enable this sort of system; are there significant concerns that have been raised by others somewhere? fetchcomms 03:11, 30 June 2011 (UTC)

I agree; considering it is a personal choice anyways, where's the possible damage? I doubt that anyone will come complaining over the fact that they could activate this for themselves. Ajraddatz (Talk) 03:19, 30 June 2011 (UTC)
So it hasn't occurred to you that once all the content has been conveniently tagged and the blinkering system made standard, there will be governments and political and religious groups making powerful arguments that they should have "copilot" control of the switch?
There are currently two main reasons why Wikipedia is relatively immune to calls for censorship from politicians and pressure groups: (1) the sheer amount of work that would be involved in classifying everything, and (2) the inviolable principle that Wikipedia "doesn't do" censorship. Implementing this feature removes both hurdles.
If there's an established censorship system then parents ought to be able to enable it on behalf of their kids, right? So that establishes the principle that the "switch" can be applied on behalf of other people. It then becomes difficult for schools and libraries not to implement the switch on their machines, to prevent students and the public using their machines to access porn, so then you have the principle that schools and public bodies can filter wikipedia on behalf of users (regardless of whether those users are adults). If parents can switch on censorship on wikipedia for porn, some of them will say that they also have the right to decide what other harmful influences they want to protect their kids from, and then, if you're a kid growing up in an oppressively religious or political community, you can't find outside information telling you that alternative views are possible, because they've been blocked. And how do you then argue that there shouldn't be other "blinkered" versions of wikipedia that are especially friendly to white supremacists, or neo-nazis, or to the Iranian government, or North Korea, or Holocaust deniers?
Like the saying goes, the price of freedom is eternal vigilance. If you don't see the potential problem in agreeing that others can set up an entire system of censorship infrastructure designed to restrict what you can see, and you think it's okay because you assume that having given them consent to build the thing on your behalf, that they're going to keep their word, and somehow resist political or legal attempts to wrest control of the switch, then I think that's startlingly naive. The only safe way to guarantee that politicians and lawyers and pressure groups aren't going to get control of that wonderful, tempting powerful switch is ... to not build the damned switch in the first place. ErkDemon 22:31, 17 August 2011 (UTC)
1) If there are controls, I am also pretty sure they'll be used for other purposes. There are a lot of people around who want to tell others what to do.
2) There seems to be an assumption here that images are optional and that the article makes complete sense without them. I find images essential in many cases. Where an important number of articles make little sense without images, isn't the idea wrong headed from the start? If going down this road, "censor the whole article" might be a more coherent approach. (I don't recommend that though.) Images are not always optional!
3) For an editor who makes a contribution periodically, but doesn't learn all the detail of the system, this may be enough of a further burden to prevent contribution. That's a bad thing, in my view.
4) There is a "contract" when editors contribute. Anything that retrospectively changes that contract will cause unease and some will consider legal action too. Mike Gale 22:49, 21 August 2011 (UTC)
I am amazed that such a system is even being considered, let alone brought to a vote, as though there was any chance of this gaining consensus. Such a system is a blatant annihilation of Wikimedia's neutrality, placing preference towards the viewpoints of certain cultures. It is impossible for such a filter to be anywhere close to neutral, short of having commons:Category:Topics as the start point for settings. My guess is that any potential filter the foundation has in mind will satisfy the opinions of a small group of people, mostly Americans and Europeans, and ignore everyone else. --Yair rand 03:29, 30 June 2011 (UTC)
I'm also thinking I'll be against this proposal, but how about we wait and see what's actually being proposed before we start engaging in any wild speculation? Craig Franklin 08:26, 30 June 2011 (UTC).
Wikipedia is not an ideological vehicle, Wikipedia is a tool for education. I strongly support this feature (although I won't use it myself) because it will result in more people using Wikipedia. I'd vote, but for some reason my Wikipedia login doesn't work on Wikimedia and the new one I created isn't eligible to vote. Unigolyn 08:45, 18 August 2011 (UTC)
Well, the comments above show why we need a discussion/vote before bringing such a system into the projects. --Bencmq 08:34, 30 June 2011 (UTC)
There will definitely be opposition to this proposal, myself included. I wouldn't be so quick to call the result just yet. Blurpeace 09:13, 30 June 2011 (UTC)
AGAINST: 1) In all the years I've been using Wikipedia, I have never needed any such feature. Neither has my 8-year-old son. 2) I do not believe blocking functionality adds value to Wikipedia; I feel very uneasy about this project. 3) I view this as essentially cataloging the entire wiki's media content according to whether it is censorable. And yet Wikipedia is impartial - what is there to block? 4) I do not think our children will benefit from this. -- M-streeter97 21:00 21 August 2011 (EST)
I'm not quite sure how it affects Wikimedia's neutrality. It says right on the main page for this, "allow readers to voluntarily screen particular types of images strictly for their own account." From that, I presume anonymous readers would still get the same content they do now, and if you don't want to filter anything (or filter everything), you would be free to do so with your account. I believe this is about giving people the tools to filter the content they don't want, rather than deciding en masse that certain content should be hidden/shown. ^demon 17:24, 30 June 2011 (UTC)
"giving people the tools to filter" - it's not about what you do on the receiving end. It's about how "they" tag (or not tag) files at source. There are 10 million files on commons. There are no human resources to assess them (have you seen the backlogs?). But there are well-known problem areas: I'm afraid that added censorship rights will become a POV-magnet for determined POV-warriors. Welcome to filter-tag wars. Larger wikipedias have some experience of handling conflict areas, commons is totally unprepared. Let's hope that the new widget will be as helpless and useless as their "upload wizard". But if, indeed, they roll out something working, prepare for the worse. NVO 18:04, 30 June 2011 (UTC)
People will not be free to filter whatever they want; such a system would be pretty much impossible to make. They will be free to filter a set bunch of groups of images that the WMF has received the most complaints about from our current English readership (well, actually, probably not the most complaints; the Mohammed images have received loads, but I doubt the filter's going to include an option for filtering those, simply because it would make Wikimedia look like the non-neutral group it will have become). There are endless numbers of groups that have their own filtering needs, which will obviously not be satisfied by this system. --Yair rand 22:16, 30 June 2011 (UTC)
Why would this be impossible to make? I don't see why everyone shouldn't be able to choose an arbitrary set of [current] Commons categories that they would prefer not to see. Would you be opposed to such a system? SJ talk | translate   03:29, 1 July 2011 (UTC)
If it really gave no preference towards certain cultures (making just as easy to hide images in Category:Tiananmen Square protests of 1989 or Category:Asparagus as images from any other categories, perhaps having selection beginning from Category:Topics as mentioned above), then it would not be a problem with NPOV, in my opinion. --Yair rand 04:36, 1 July 2011 (UTC)
It would be better as you suggest, but we would still have the problem that we don't agree on categorization. Nemo 10:42, 1 July 2011 (UTC)
Yair: thanks for clarifying. Nemo: true, but we already have that problem with applying categories. [re]categorizing media would become more popular as a result, but that might be a good thing just as it is with article text. SJ talk | translate   03:05, 3 July 2011 (UTC)
I, for one, am amazed that anyone at all would defend the people who upload thousands of pictures of their dicks, or shit on plates and put images of them into articles. Though it's worth noting that people have resisted efforts to even stop the dick-posters in the past. 75.72.194.152 14:38, 19 August 2011 (UTC)
I find myself agreeing with the opposition on this issue. Regardless of who decides what is to be subject to censorship, and regardless of the fact that it is opt-in, it still puts the Wikimedia Foundation in a non-neutral position. Taking an example from my own area: what if we deemed images of the Tiananmen Square protests of 1989, or images of leaders of a certain religious group banned by the Chinese government, to be "controversial", and therefore added them as an option on the filter? We would rightly be labeled as non-neutral for supporting the Chinese government's view of what is and is not controversial. I don't want us ever to be in a position like that. We are a neutral provider of knowledge. Individuals already have the ultimate filter built in: they can simply not look at what they don't like.--Danaman5 23:44, 30 June 2011 (UTC)
Things are controversial if some people think they are. So yes, we should add a tag to those images if some Chinese think they are offensive. Anyone should be able to come in and declare an image offensive. Only then would it be neutral. Of course, a lot of images would be deemed controversial that way. Zanaq 09:38, 1 July 2011 (UTC)
Agree with Zanaq. If someone thinks a picture of a naked person is controversial, that person should be allowed to add a filter, a symbol or whatever s/he likes to change the picture or the way it's linked. You want to give work to some filter committee so that they have no time to improve Commons, Wikipedia or whatever project? Not a brilliant idea, it seems to me. Cheers, 79.53.129.80 11:28, 1 July 2011 (UTC)
Hello, may I add a thought on this?

The thing is, any encyclopedia that hosts image pornography is misguided and betrays trust in the first place, or else it would be merely an adult site, not a genuine encyclopedia of merit. Wikipedia requires morality regardless of the wide spectrum of user opinions or other filters. Users who post anything biased should expect to have their item debated in the sandbox. Regarding China: a country should be allowed to choose what not to see if a person can. In China the people did not get to vote for any other party, so that is their choice. Make ready-made choices for the whole of WP and its images, to accommodate nations whose government needs an option to block things. Where WP is not allowed and would be banned due to its open content, WP ought to offer a strong way to securely select a PG or China-friendly content filter, or another dictator-styled filter, set in stone and compulsorily mandated in the WP architecture as a location-based filter, including for anonymous users. Otherwise there are whole countries of people who get none of the WP articles because of an all-or-nothing preponderance. WP's one-size-fits-all policy means WP may deny people their right to see what is allowable in the place they live; simply because of inadequate filters, people must wait. There could also be supplementary personal filters, even a Saintly or an Atheist filter, although those may remove most of the articles on human history. A China-tickable filter or a Vegan-friendly locked-down WP-made filter could be popular selections and bring new users. Basically this might allow people to create a gallery of custom filters, like Mozilla add-ons for Firefox. The possibility is about better WP article usage, by widgetising several search terms for example, so a search delves down the right learnt link rather than fanning out everywhere. Marsupiens 06:12, 16 August 2011 (UTC)

This has been talked about for years. The idea that has emerged is that there would be multiple, separate filters, so you could pick and choose. The most commonly requested ones are for pornographic images (requested by basically everyone in the world except white males), religiously sensitive images (notably but not exclusively images of the Prophet Mohammed), and violent images (e.g., the picture of a man in the act of committing suicide).
I suspect that most of us would not choose to have any of them turned on, but the opposition's story to this point has generally been that if people don't want to see these images, then they should write their own third-party software (maybe a browser plug-in) because WMF should never-ever-ever offer options even remotely like Google's "safe search". In this story, enabling non-adepts to voluntarily customize their experience (e.g., by permitting a person with PTSD to suppress images that might trigger symptoms) is pure, evil censorship. WhatamIdoing 15:44, 1 July 2011 (UTC)
I'm a white male but would happily support allowing people to opt in to various filters, including grades of porn. But we need the classification to be transparent and the filtering to be genuinely a matter of individual choice - otherwise we would be accused of bias. Some people would define porn very narrowly; others would include bare arms, legs or even female faces. I suspect it would be impractical to create a filter for every individual request, though we could go a long way towards that if any category could be filtered out by any editor. I know someone who would opt to filter out images that contain spiders. I think we should also allow groups of Wikimedians to create custom filters that people could choose, as opposed to going through a ream of options. So presumably a moderate Shia filter would set the porn threshold at bathing suits, ban the cartoons of Mohammed, and perhaps images of pigs and pork? Whilst a "western cosmopolitan child-friendly filter" would enable you to filter out graphic violence and set the porn filter at bathing trunks. The difficult questions we have to resolve before the referendum include:
  1. What do we do if particular governments say they would be willing to unblock Wikimedia if we allow them to filter out what they consider to be porn or religiously offensive (I'm assuming that we won't cooperate with countries that want to filter out images that might be politically controversial)?
    Suggestion: say No, just as we do today.
  2. What happens in discussions on Wikipedia if some people want to illustrate an article with images that others can't see because they have opted to filter them out? My preference is that we enable an "alternate image for those who've filtered out the main image".
    I believe the current proposal is to have the image visibly hidden, the way sections are rolled up, so it is clear that an image is there but not being shown, in case the reader wants to see it.
    If it is planned to work that way, I wonder how it will work with anything besides the mere use of images for illustrating articles. In the Portuguese wikipedia we use a painting of a naked woman as the symbol for our Art portal, embedded in a template and displayed at every artwork article. I'm sure the more puritan would like it to be hidden from view as well.--- Darwin Ahoy! 12:16, 3 July 2011 (UTC)
  3. What do we do when people disagree as to whether a particular image shows enough breast to count as topless, whilst others point out that there appears to be a diaphanous top? Do we give filters the right to fork?
    This is already an issue that comes up in the categorization of images. Policies tend to be vague at present; if anyone can create their own filters, then you could define a combination of descriptive categories, each of them relatively neutral, that fit what you don't want to see. (There could still be disagreements about how many different similar categories are appropriate for Commons. For instance, different people with w:ailurophobia will draw different lines around what counts as 'a cat', from the perspective of their reflexive fear.)
  4. Do we use this as an opportunity for outreach to various communities, and if so are we prepared to set boundaries on what we will allow filters to be set on? For example, I don't see that a porn filter is incompatible with being an educational charity, but I would be horrified if a bunch of creationists tried to block images of dinosaurs or neanderthals, especially if they did so as part of a "Christian" filter (I suppose I could accept individuals knowingly blocking categories such as dinosaurs). WereSpielChequers 09:54, 3 July 2011 (UTC)
    I can't think of a neutral way to set such boundaries. Personally, I would be fine with any arbitrary personal choices people make in what images they want to see -- dislike SVGs? fine! -- just as in what pages they watchlist. I see this as a tool to support friendlier browsing, not an occasion to set new official boundaries. SJ talk | translate   11:42, 3 July 2011 (UTC)
    The reason we need to vote is that many people, myself included, are against any violation of the basic core principle of NOT CENSORED. There are three legitimate ways to avoid images: 1/ turn them off in one's browser; 2/ fork the encyclopedia to a version that omits NOT CENSORED; 3/ devise an entirely external feature making use of our ordinary metadata. What is not acceptable is for the WMF to interfere with the content in this manner. The fundamental reason for NOT CENSORED is the general commitment to freedom of information as an absolute good in itself--freedom means not just non-removal of information, but also not hindering it--and the practical reason is that no two people will agree on what should be censored. DGG 13:53, 16 August 2011 (UTC)
But how can this be censorship when there is a button on every hidden image to show it - and when each individual has the ability to simply turn off their filters? Censorship is when we hide an image in such a way that people can't see it when they want to do so. I would be strongly opposed to censorship - but this isn't that - this is giving people control of what they prefer to see...it's no different from allowing them to close their eyes or look away if they see something distasteful in the real world. SteveBaker 18:54, 17 August 2011 (UTC)
The buttons are fine, I think. If all we needed were some buttons, I'd be all for it. The problem might be the exact way we implement how those buttons work. If we do it wrong, we automagically enable censorship by third parties. And it seems to me that it is actually very easy to do it wrong, and very hard to do it right.
The community consultation as stated in the board resolution needn't (and probably shouldn't) really be a referendum on what trade-offs devs should make. What is really needed is a discussion on "how do we do this right". If there happens to be a way, then we can implement it. If together we can't find a safe way to implement the feature within the framework of the board resolution and our founding issues, then we can provide feedback to the board to that effect. --Kim Bruning 23:34, 19 August 2011 (UTC)

Alternative ideas

low-res thumbnails

I've commented on this before, but will repeat: I think it would be more generally beneficial to allow users a setting to override page settings about the size of thumbnails, so that, for example, you could decide for all thumbnails to be shown at 30-pixel resolution (also for all images to be shown as thumbnails) regardless of the Wiki code. This would help low-bandwidth users as well as those with specific objections. My hope is that at some low resolution - 20 pixels if need be - there is simply no picture that will be viewed as intensely objectionable. I wish your referendum would investigate in this direction rather than pressing for people to "neutrally" place ideological ratings on specific images. Wnt (talk) 23:56, 30 June 2011 (UTC)
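A rough client-side sketch, in TypeScript, of what such an override might look like. This is an assumption-laden illustration: it presumes every thumbnail URL carries the usual "<N>px-" width component, and, because it runs after page load, it would not actually save bandwidth; a real low-bandwidth version would have to act before the image requests are issued.

  // Cap every thumbnail at a chosen width by rewriting the "<N>px-"
  // component of its URL; larger variants are dropped from srcset.
  const MAX_WIDTH = 30; // the reader's chosen cap, in pixels

  function capThumbnails(doc: Document, maxWidth: number): void {
    for (const img of Array.from(doc.querySelectorAll<HTMLImageElement>("img"))) {
      const capped = img.src.replace(/\/(\d+)px-/, (match, w) =>
        Number(w) > maxWidth ? `/${maxWidth}px-` : match
      );
      if (capped !== img.src) {
        img.src = capped;
        img.removeAttribute("srcset"); // keep the browser from fetching larger variants
      }
    }
  }

  capThumbnails(document, MAX_WIDTH);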

This would be an interesting low-bw option, but seems like a separate issue. If an image is clear enough to be better than no image (the ultimate low-bw option, after all), it is clear enough to be controversial. 'Placing ideological ratings on images' would be an unfortunate outcome. SJ talk | translate   03:29, 1 July 2011 (UTC)
If we really need to take into consideration the wishes of the more squeamish readers/editors, this seems to me the only viable option. I can see quite a few advantages:
  • It would avoid all the work to set up the tagging and the software to implement it. Energies that could be better invested elsewhere.
  • It would avoid all the editing wars that I already see ahead, and the consequent loss of time and energy.
  • It would be an advantage for people accessing the wiki with slow connections/old machines.
As the only (tiny) drawback, the articles would not look as sleek and appealing as they do now. It seems to me a win-win solution. --Dia^ 07:48, 16 August 2011 (UTC)
The cultural/religious objections to the depiction of Mohammed have no limitations on image resolution. I agree that it makes no sense to object to such low resolution images - but I also agree that it makes no sense to object to full-resolution images either. The point isn't whether the objection seems rational or not - the point is that there is a very real objection for some people. It is perfectly possible for an organization to say "No looking at pictures of naked people on company time" - without having the common sense to say "No looking at pictures of naked people at high enough resolution to see anything interesting" - and for a religion to say "Even a 1x1 pixel image of Mohammed is objectionable because it is the intent of the depiction rather than the actual light falling onto the human retina that counts". SteveBaker 19:17, 17 August 2011 (UTC)


(assisted) browser plug-in

I agree with user Koektrommel: "If people want to filter, fine, do it on your own PC." (somewhere below in this discussion), but also see user WereSpielChequers' statement: "As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti."
Why not go an intermediate way: make it easy for the user/browser plug-in to filter, but keep it local on the user's PC? If every picture sends its category/tags with it, a local browser plug-in can filter easily. How complicated this plug-in (or a competing one) is to use is a problem for the plug-in developers (and for the user, who should choose an easy-to-use one). (A reader unable to install a plug-in should perhaps think about taking a lesson on using the internet...)
Tagging and categorising pictures remains a task for the community; but perhaps it could be less 'absolute', e.g. with relative values like: image-showing-almost-naked-woman: "67% for ~yes, she's close to naked~ / 33% for ~normal light clothes~" (123 users rated this picture).
--129.247.247.239 08:57, 16 August 2011 (UTC)
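A minimal sketch, in TypeScript, of the plug-in side of this idea. It assumes, hypothetically, that the wiki shipped each image's categories in a made-up data-image-categories attribute; no such attribute exists today, and the category names are invented:

  // Hypothetical browser plug-in content script. Everything stays on the
  // reader's machine: the server sends all images plus their tags, and
  // the plug-in decides locally what to hide.
  const blocked = new Set(["nudity", "violence"]); // the reader's local choices

  function applyLocalFilter(doc: Document): void {
    const imgs = doc.querySelectorAll<HTMLImageElement>("img[data-image-categories]");
    for (const img of Array.from(imgs)) {
      const cats = (img.dataset.imageCategories ?? "").split("|");
      if (cats.some((c) => blocked.has(c))) {
        img.style.visibility = "hidden"; // hidden locally; nothing is filtered at the server
        img.title = "Hidden by your local filter";
      }
    }
  }

  applyLocalFilter(document);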

Congratulations on your retrograde help in implementing image tag wars. WP should have no involvement in subjective image tags, as they will be the beginning of the end of our work. If an image is not defined as illegal by law, it should automatically be permitted to reside on Wikipedia unhindered and untagged by people's opinions of whether a woman's nipple is exposed or covered, or whether the woman exposing the nipple is a 'naturist' or a 'slut'.
If an image is inappropriate to an article, by all means remove it; however, Wikipedia should have zero involvement in image tags and filtering software. Wikipedia is not your nanny. Harryzilber 16:43, 17 August 2011 (UTC)
All users without the (active) plug-in see all images. I don't accept the pseudo-reason "beginning of the end of our work". As far as I can see, this whole discussion is about a method to accommodate people's personal opinions - if WP wants a possibility, e.g. for Muslims to mask "naked" bodies, you'll have to enter the ring of opinions and fight (what's too naked, what's acceptable, ...).
To "WP is not your nanny": At the moment, the community regulates what's the NPOV about a subject. I trust in the community to regulate what's the NPOV about a picture's categories. Whether this democratic opinion is a boolean one (in the category or not) or a relative one (0..100%), the fighting ring is open in both cases. At the moment, the community regulates it, so WP already is my nanny - and we accept it, by using WP.
--129.247.247.239 12:10, 18 August 2011 (UTC)


Collapsible image

The "offensive" images are collasped. --tyw7 19:32, 19 August 2011 (UTC)

Suppress all images

Please choose another browser. Opera has such functionality: simply click on "show images" or "no images". Could this feature be implemented in all other browsers, if we ask for it? 92.72.198.228 21:37, 17 August 2011 (UTC)

Yes, all major browsers have this functionality. Rich Farmbrough 23:03 17 August 2011 (GMT).
Another way would be to remove the browser altogether. Then you can be sure you won't find any disturbing content anymore. At least on the Internet. --Niabot 00:20, 18 August 2011 (UTC)

Scope

If I understand this proposal correctly, the settings are to apply to each user's own account(s) and are customisable. I do not see this to be a problem, as it would entirely be a matter of personal choice. However, assuming that non-logged-in users are able to see the full gamut of images from the smutty to the explicit, I fail to see the point: creating such a system, which may or may not include the rather elaborate suggested protection system requiring passwords, seems to be technology for its own sake. Something that can be circumvented by simply logging out has to be pointless. --Ohconfucius 04:56, 16 August 2011 (UTC)

Agreed! --81.173.133.84 08:12, 16 August 2011 (UTC)
Logged-in users are not required to see the full gamut of images. The mock-ups on the attached page all "show how the hider might look to an anonymous user", not a logged-in user. It's likely that the system would be cookie-based for logged-out users. If readers want something more permanent, they should create an account and have their settings saved. Cbrown1023 talk 15:17, 16 August 2011 (UTC)
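As a loose illustration of the cookie-based idea for logged-out readers, here is a TypeScript sketch; the cookie name and pipe-separated format are invented, not part of any actual design:

  // Sketch: remember a logged-out reader's hidden categories in a cookie.
  function saveFilterPrefs(hiddenCategories: string[]): void {
    const value = encodeURIComponent(hiddenCategories.join("|"));
    document.cookie = `imgFilter=${value}; path=/; max-age=${60 * 60 * 24 * 365}`; // kept one year
  }

  function loadFilterPrefs(): string[] {
    const match = document.cookie.match(/(?:^|;\s*)imgFilter=([^;]*)/);
    return match ? decodeURIComponent(match[1]).split("|").filter(Boolean) : [];
  }

  saveFilterPrefs(["violence"]);
  console.log(loadFilterPrefs()); // ["violence"]
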
If I understand the proposal correctly, it is meant to be entirely voluntary. Someone who wants to circumvent the image blocks can simply change his preferences or not opt in in the first place. What does circumventing by logging out have to do with anything?--166.250.0.49 05:39, 17 August 2011 (UTC) Looks like I wasn't logged in.--Wikimedes 08:17, 17 August 2011 (UTC)
  • I still can't wrap my head around how this software filter will work under this proposal. The only way I see this working effectively is to have protection fully switched on as default, even for IPs, with only logged-in users able to opt out by disabling protection in their preferences. So it seems that the choice is between an effective system and one that violates IPs' and ordinary users' real choice. --Ohconfucius 16:22, 19 August 2011 (UTC)

Design

Block all images option

The only image filter that I would really like to see would be one that blocks all images, as long as I could unblock them with a simple click (or right-click then select). (I know that this can also be done in the browser or with plugins such as Adblock, but each presents problems: many web sites use images for links, so blocking images can make navigation difficult, and sometimes collapsed images distort the web page. Unblocking images blocked by a wildcard filter in Adblock can sometimes be a pain too.) I spend most of my time in Wikipedia reading history articles, where images add little information content and seem to be there mostly to enhance the user experience. For people with a slow internet connection (I was still on half-speed dialup a few months ago), not having to load images really helps navigation speed. Does loading images each time an article page is visited use Wikipedia's system resources?--Wikimedes 07:09, 17 August 2011 (UTC)
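A small TypeScript sketch of the "block everything, reveal on click" behaviour described above (illustrative only; because it runs after the page has loaded, it would not save bandwidth, which is the part a real implementation would need to solve before image requests go out):

  // Replace every image with a placeholder button; clicking swaps the image back.
  function blockAllImages(doc: Document): void {
    for (const img of Array.from(doc.querySelectorAll<HTMLImageElement>("img"))) {
      const btn = doc.createElement("button");
      btn.textContent = `Show image: ${img.alt || "(no description)"}`;
      btn.addEventListener("click", () => btn.replaceWith(img), { once: true });
      img.replaceWith(btn); // the original element is kept in memory for the swap
    }
  }

  blockAllImages(document);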

Different idea for categorization

Instead of trying to categorize the infinite ways that people can find images offensive, how about allowing people to block images on all pages from a certain project or project category? People who don't want to see images of battlefield carnage could block images on Military and Warfare or death pages, people who don't want to see depictions of Mohammed or the Crucifixion (very violent) could block images on religion pages, people who don't want to see images of tantric sex could block images on religion or Sexology and sexuality pages, etc. I could block images on history and society pages, where they're not usually necessary, but leave them on Technology or Medicine pages, where they're often useful. Granted, if a reader's goal is to block offensive images, this type of filtering system might be even more porous than one based on objectionability. But a filtering system based on objectionability means that the Wikimedia Foundation is taking a position on what's objectionable, to the extent of creating objectionability categories. A filtering system based on project categories would avoid that.--Wikimedes 07:09, 17 August 2011 (UTC)
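A tiny TypeScript sketch of this page-level variant ("topicCategories" is an invented field standing for the subject areas of the page, not of any image):

  // Hide images on pages whose *topic* matches the reader's choices, rather
  // than classifying individual images by objectionability.
  interface PageInfo {
    title: string;
    topicCategories: string[]; // invented: the page's subject areas
  }

  const imageFreeTopics = new Set(["Military and warfare", "Religion"]);

  function hideImagesOnPage(page: PageInfo): boolean {
    return page.topicCategories.some((t) => imageFreeTopics.has(t));
  }

  console.log(hideImagesOnPage({ title: "Battle of the Somme", topicCategories: ["Military and warfare"] })); // true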

Allow people to make up their own categories

If someone hates images of cats, then shouldn't she be able to make up a category of such pictures and put them in that category so that they could be blocked on her computer? Rickyrab 17:59, 19 August 2011 (UTC)

Or, say, images of Newt Gingrich or of people whacking others with pots and pans, or of heavy traffic, or of bad abstract sculptures. Such categorization might require a lot of work on the user's part, but I figure it could be doable by clicking a categorization button or a hiding button on a picture. Rickyrab 18:04, 19 August 2011 (UTC)
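A sketch of the personal-category idea, assuming the reader's lists are kept purely client-side (e.g. in localStorage); the storage key scheme is invented for illustration.

```typescript
// Sketch of reader-defined filter categories kept entirely client-side:
// a "hide this" control files the image under a category the reader invented
// (e.g. "cats"), stored in localStorage. The key scheme is illustrative.

function addToPersonalCategory(category: string, imageUrl: string): void {
  const key = `personalFilter:${category}`;
  const urls: string[] = JSON.parse(localStorage.getItem(key) ?? "[]");
  if (!urls.includes(imageUrl)) {
    urls.push(imageUrl);
    localStorage.setItem(key, JSON.stringify(urls));
  }
}

function isPersonallyBlocked(category: string, imageUrl: string): boolean {
  const urls: string[] = JSON.parse(
    localStorage.getItem(`personalFilter:${category}`) ?? "[]",
  );
  return urls.includes(imageUrl);
}
```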


Practical considerations

I am a Wikimedia Foundation developer and I work frequently with Wikimedia Commons administrators. While I have given some input to Robert Harris when he was writing his report, I was disappointed by what was eventually proposed. Note, I am only speaking for myself as someone familiar with how Commons is run. I don't speak for the Foundation, nor do I disrespect the people involved in this proposal, as I think they are making a good faith effort to satisfy serious concerns.

I don't know which users will be satisfied by this. Censorious people won't be happy until the objectionable content cannot be accessed at all. If there are people who in good faith try to use this -- for instance, to have a classroom computer that only accesses a "clean" version of Wikipedia -- the students will get around it in seconds, via the button that we put right on the page. So I don't see who we are trying to satisfy here.

Furthermore, this proposed tool adds to the burden imposed on Commons administrators. I have amended this complaint, as Eloquence makes clear below that they are contemplating a system that doesn't use the usual categorization systems, but something simpler for non-technical users.

Finally, it would be very difficult to implement this tool as designed. I have amended this as well, since there was a discussion of which I wasn't aware, and the implementation is quite plausible. See Eloquence's statement below.

I don't agree that our problem is about images that are already correctly placed in the context of a Wikipedia or Wikimedia Commons page. Our problems are more about community members who intentionally misplace content. For instance, people who try to get the most shocking image they can onto the front page of a project. This boils down to a few people who are simply trolls, and a somewhat larger number who, I think, are misguided about Commons principles.

It is also conceivable that search results might contain unnecessarily disturbing or shocking images. But this is far easier to deal with. This is a case where we are generating results dynamically, and we can easily filter for images that should not appear in a search (or that should have a kind of click-through-to-view interface).

I doubt there will be a need for a multi-dimensional categorization scheme here; we can simply have a single category "not for public areas / search results without confirmation". Reasons to categorize an image this way can be dealt with in the usual fashion, via the edit summaries.

These problems suggest different solutions, particularly a community effort to clearly define what is and isn't appropriate, new tools to boost the effectiveness and vigilance of Commons volunteers, and an effort to increase the numbers of administrators or recognize the important work they are doing. And such an approach would be necessary anyway, even if the proposal as defined were accepted.

-- NeilK 23:48, 16 August 2011 (UTC)

I think you gave us a nice conclusion. First, it's not an easy task to implement this feature. Second, it's hard to decide which content should be excluded. Third, it comes with a lot more effort than most would imagine.
The implementation part is the first issue we would have to address. Given the fact that we already have enough technical issues, this will stop the development of more needed features. But I guess that isn't the biggest issue, since we can "throw cats around" (the stupidest thing/idea that I ever came across).
 
Good enough to be a finalist for Picture of the Year on Commons, but too offensive for the English main page.
The question of which content is objectionable and which is not is a very difficult one. I would never like to put it into the hands of the WMF, since I know how different the viewpoints are. In the end we would have hours/days/months spent on such decisions, which is in my opinion wasted time and effort. I added a simple example to the right. While Commons and other projects never had a problem with the image, the English-speaking Wikipedia strongly rejected it. A picture that can't be featured, because it would be too controversial for the main page...
The third thing is vandalism. The admins already have a lot of trouble and many jobs to do. Now we add an extra job which isn't easy at all (see above). On top of that, an admin would need to decide about a great variety of images. While experts on the topic would not see a problem, an admin might, and the other way around. In the end we could end up making a poll about every image to decide whether to include or exclude it.
I doubt that this would be a good approach at all. It's no final solution, it leaves much room for exploitation, and it is a burden for all authors and/or admins. --Niabot 00:22, 17 August 2011 (UTC)
Niabot -- I agree with some of your points and you bring up an elephant-in-the-room problem. Commons admins and volunteers are by far the most hostile to anything that looks like censorship. And they have different values even from the English Wikipedia, let alone Arabic. So where are the volunteers going to come from, to categorize and filter images? NeilK 00:43, 17 August 2011 (UTC)
Hi Neil! I am glad you brought up better tools for Commons; you are someone who is in a great position to help define and build such tools. The last part of our resolution that led to all this in fact says "We urge the Wikimedia Foundation and community to work together in developing and implementing further new tools for using and curating Commons, to make the tasks of reviewing, deleting, categorizing, uploading and using images easier" -- and to my mind that's perhaps the most important bullet! What tools would you imagine? (And if this design for this feature won't work -- ok; it will have to change. I don't think it's gone through full technical review, you're right.) -- phoebe | talk 00:37, 17 August 2011 (UTC)
Phoebe -- As for categorization tools, I don't have a clear idea about what needs to be done next, but I do know that it's a huge pain point at the moment. NeilK 00:56, 17 August 2011 (UTC)
A clarification regarding technical assessment. Prior to Brandon's design, Brandon and I talked through the technical requirements of a personal image filter with Roan Kattouw, an experienced WMF engineer. The strategy recommended by Roan was to embed filtering information (such as a small set of relevant filter categories) in the HTML/CSS, allowing it to be cached, and then doing the filtering via JS. This avoids any "phone home" calls on the client; the server-side lookup should be a fairly efficient DB query. The parser would simply add relevant image category annotations to its output (not for all categories; just for ones relevant for filtering). This strategy is similar to the one proposed by Aryeh Gregor (another experienced MediaWiki developer) here: mw:User:Simetrical/Censorship.
We discussed the issue of DOM flickering; I believe Roan stated at the time that it would likely be avoidable in most browsers provided the JS is executed at the right time, but this is something we'll need to assess in more detail. For browsers without JS, the filter option would simply not be available (and it could also be turned off in browsers which do not support it well).
I agree that the correct categorization of images on Commons is the main mechanism through which a filter would succeed or fail, and this is both an issue of efficiency of categorization tools and arguments around classification. My own view is that it's likely advisable to look further into simple tools for readers to report potentially objectionable content, which could be used either to suggest classification (through a hopefully simplified UI), or to automatically derive it using a scoring algorithm. That would help to get correct classification in places with high viewership, which is where it arguably matters most.--Eloquence 01:02, 17 August 2011 (UTC)
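One possible reading of the strategy Eloquence describes, as a sketch: the parser annotates each image wrapper with its filter-relevant categories so the HTML stays cacheable, and a small client-side script hides whatever matches the reader's chosen filters. The data attribute and category names here are assumptions, not the actual design.

```typescript
// One possible reading of the strategy described above: the parser annotates
// each image wrapper with its filter-relevant categories (keeping the HTML
// cacheable), and a small client-side script hides whatever matches the
// reader's chosen filters. The data attribute and category names are
// assumptions, not the actual design.

function applyPersonalImageFilter(hiddenCategories: Set<string>): void {
  document
    .querySelectorAll<HTMLElement>("[data-filter-categories]")
    .forEach((wrapper) => {
      const categories = (wrapper.dataset.filterCategories ?? "").split("|");
      if (categories.some((c) => hiddenCategories.has(c))) {
        wrapper.classList.add("image-filter-hidden"); // CSS collapses the image
      }
    });
}

// Run as early as the DOM allows, to limit the flicker discussed above.
document.addEventListener("DOMContentLoaded", () =>
  applyPersonalImageFilter(new Set(["sexual", "violent"])),
);
```

Even at DOMContentLoaded some flicker can remain; one common alternative is to hide candidate images via CSS before first paint and have the script reveal the non-filtered ones, though that inverts the no-JS fallback mentioned above.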
Okay I have amended the rant above. NeilK 01:05, 17 August 2011 (UTC)
I will reiterate the points I made in the collapsed section. Virtually every image is offensive to someone. The Ten Commandments forbid, under certain readings, images of actual things (in the heavens, on the earth, under the earth and in the seas). Various hadiths prohibit illustration of animals and people. Certain Jewish sects avoid picturing women. And that is just views from mainstream religions (albeit not incredibly widely held among those religions). What about Chinese bans on pictures of the Dalai Lama and the Tibetan flag? If we implement this proposal, we do several things:
  1. Create a massive burden where ten million pictures need to be classified in probably hundreds of ways
  2. Create a chilling effect for readers, who will risk being asked "why they didn't filter the image".
  3. Create a chilling effect for contributors who wonder whether their image will be censored.
  4. Create problems for service providers (schools, libraries, cafes, ISPs, universities) who will not know what they should be doing (especially in the US where libraries are both constrained to and prohibited from applying software filtering)
  5. Create potential legal liability for ourselves when the filters fail (as they will, inevitably)
  6. Create the inevitable scenario where (as has already happened) we are asked to make "filter on" the default - and will have trouble refusing.
This is an excellent idea; unfortunately, thinking it through reveals what appear to be insurmountable obstacles. Do we really want to provide a filter that excludes all photographs of women? But, in all conscience, can we not do so, if we are providing filters for those with other beliefs?
I have, on considered reflection, voted "0" for the first question in the referendum "It is important to provide this facility..." and I would urge others to do the same.
Rich Farmbrough 02:33 17 August 2011 (GMT).
The main application of a feature for an optional opt-in, voluntary, single-user censorship system is the creation of a central censorship database that can then be used for compulsory, externally-imposed, multi-user censorship. Once that database is created, the political momentum for putting it to other use becomes almost impossible to stop.
Once we get to a situation where all images of nude statues are already flagged for nudity, it's a small matter for a Texas campaign group to argue that taxpayer dollars shouldn't be used to show "known porn" to schoolkids, and to insist that all Texas school and library browsers have a plugin installed that automatically identifies wikipages with flagged content and blocks them.
Some people find the concept of Darwinian evolution disgusting, repugnant and offensive. Do we allow creationists to mark all articles on evolution as "controversial" or "political", and therefore blockable in schools by default once the anti-porn plugins are installed? What if some of the right-wing groups decide that anything to do with the WW2 German death camps is "disputed", "controversial" and "offensive" (on the grounds that it's disputed by them), and get those blocked, too? Once a flagged image is used in an article, it's a small matter for a censorship plugin to treat the article itself as flagged, so if a group decides that any photographs of Auschwitz are offensive (including the famous images of the gate), then suddenly all pages that use any of those images are identifiable and blockable, and the holocaust deniers can point out to their kids that since the images are "officially" flagged as "disputed" by Wikipedia, they're officially accepted as unreliable. China can flag all pictures of the Dalai Lama as offensive pro-terrorist propaganda, white separatists can flag all images of Martin Luther King and Nelson Mandela, Germany can flag all images of swastikas, and if this were a few decades ago, the then British Government would have wanted to flag all images of IRA members to deny them "the oxygen of publicity". Global warming sceptics can flag all graphs and maps used to argue the case for GW; heck, the people who think the Moon landings were faked can eagerly track down all the images and tag those as disputed, or POV, too. Every religious, political or hate group will be able to organise partially-censored versions of Wikipedia that eliminate images, or pages including images, that they can tag as belonging to an opposing viewpoint, and insist that their own communities only use the "blinkered" versions ... with the Wikipedia logo and brand name lending the resulting mangled collection of pages credibility.
Once the infrastructure is built, it will be used, and the Wikimedia Foundation won't be able to argue that they shouldn't allow other censorship models, because they'll already have crossed that bridge and argued that these things are a good idea in principle, and worth implementing. They'll have created the precedent. And they'll have moved from being one of the major forces for the freedom of information to one of the architects of a delusional, divided, blinkered humanity, separated into groups that are each only allowed to know and see what their local community and religious leaders want them to know and see. ErkDemon 21:48, 17 August 2011 (UTC)
Agreed That's a pretty serious consideration. If someone wants to censor something for himself or herself, fine. But mandatory censorship is another issue entirely, and could be problematic if someone is just curious about something and wants to find out more about it. Rickyrab 18:30, 19 August 2011 (UTC)

Thoughts on unauthenticated users, flagging, and categories.

  • Unauthenticated users: I think making the filtering options available for unauthenticated (logged-out) users is desirable, but not necessary. Google makes it possible for unauthenticated users to save their SafeSearch settings in a cookie and Wikimedia could do the same thing. However, for the initial deployment of this filtering feature, I think it would be acceptable to assume that unauthenticated viewers should automatically have controversial content hidden, without the option of disabling the filter. They could view individual filtered images, one by one, by clicking on "show this image" links near the hidden images. If users want to change filter settings, they should authenticate as a wiki user. I've seen other websites that operate this way. In later updates of this feature, the filter settings for unauthenticated users could be enabled, if that seems necessary.
  • Reporting/flagging images: How would this work? Would images earn a flag if enough users flagged the image as such? This flagging should only be available to authenticated users in order to prevent attacks on the service that attempt to cause innocuous content to be hidden. In this case, authenticated users should also have validated email addresses, and possibly a confirmation email should be sent to them before accepting their "vote". (A sketch of one possible threshold scheme follows after this list.)
  • Categories: I think categories would be nice to have, but not essential. The filter feature could be implemented without categories at first, with categories added later if that seems necessary. Who will decide what the categories are? For example, seeing the discussion about images of Muhammad (above) makes the idea of a "religiously offensive" category sound important to some people, but others won't care about it. If the number of categories is limited, those that strongly disagree with the existence of a category will see it as a waste of the available category slots. If the number of categories is unlimited, the number of choices may overwhelm some users. I'm not arguing one way or the other at this point. I'd like to see the filtering feature implemented, but I think there are too many questions about categories and I'd hate to see that delay the implementation.

--Lance E Sloan 14:32, 19 August 2011 (UTC)
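To make the flagging bullet above concrete, a hypothetical server-side sketch of a threshold scheme: an image counts as flagged only once enough distinct, email-confirmed accounts have reported it. Types, names, and the threshold value are assumptions for discussion.

```typescript
// Hypothetical server-side sketch of the flagging idea in the list above:
// an image counts as flagged only once enough distinct, email-confirmed
// accounts have reported it. Types, names, and threshold are assumptions.

interface Flag {
  userId: number;
  emailConfirmed: boolean;
}

const FLAG_THRESHOLD = 5; // arbitrary example value

function isImageFlagged(flags: Flag[]): boolean {
  const confirmedVoters = new Set(
    flags
      .filter((flag) => flag.emailConfirmed) // ignore unconfirmed accounts
      .map((flag) => flag.userId), // the Set gives one vote per account
  );
  return confirmedVoters.size >= FLAG_THRESHOLD;
}
```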

Thanks for your thoughts -- phoebe | talk 15:24, 19 August 2011 (UTC)

How to

Are you mad? That is way too difficult and nerdy. There should be buttons to click, not this elitist stuff. --Werwrwer 15:03, 19 August 2011 (UTC)

Comment (registering)

In my opinion, the feature must be able to be disabled without registering! Many forums allow "blocked" images to be viewed only if the viewer has registered for the forum. --tyw7 19:33, 19 August 2011 (UTC)

This feature is only able to be enabled. It is not "on by default", which seems to be a common misconception. If you are not registered - an anonymous IP - you will be able to enable it if you so desire, but there may be difficulty in saving your preferences. It is very important to understand that blocking of images only occurs if you choose to block them. Nothing happens automatically, whether you are registered or not.--Jorm (WMF) 22:31, 19 August 2011 (UTC)