Talk:Image filter referendum/en/Principles

Matters of principles

Quit trying to fix something that isn't broke

Wikipedia is one of the most successful sites on the Internet despite not having content restriction features. Let's leave well enough alone and stop tinkering with the open principles that made Wikipedia successful in the first place. Inappropriate images aren't a major issue, but the slippery-slope solution presented has the potential to ruin Wikipedia itself. Jason Quinn 15:24, 16 August 2011 (UTC)

Approaching Wikipedia with a "don't fix what ain't broke" attitude will stifle its ability to progress. Yes, censorship is a delicate issue, but as long as it is opt-in, we are relatively far from the dangerous slippery slope, imho. I can't imagine any scenario where people would be driven away from a website that allows them to filter things they might not want to see. Nobody complains about Google's safe search feature; if you don't want a "safe" search then you keep it turned off. B Fizz 18:08, 16 August 2011 (UTC)
The existence of the categorization system used for censorship would, however, be neither opt-in nor opt-out. It either exists or it does not.
The existence of this system allows for multiple avenues of attack against neutrality on any particular object in our custodianship. To wit, some attacks mentioned on this page: 3rd party usage of the categorization system in ways in which it was not intended. Legal attacks on categorization of certain items (both pro and contra). External entities able to mobilize large groups of people conspiring to place certain items under censorship (or remove them from censorship), for reasons of profit, trolling, or political gain.
We already deal with some of these problems on a regular basis, of course. I'm not entirely sure why we would want to make ourselves more vulnerable, however.
The software side is fairly easy, and is, in itself, indeed quite innocent. The problem is the categorization scheme, which is going to be biting us in the rear for years to come.
--Kim Bruning 15:48, 19 August 2011 (UTC)
But Kim, aren't all of these problems absolute non-issues as long as the filter is strictly opt-in? People can use whatever categories there are to filter images as closely along their own preferences as the system allows at any given moment. --87.78.46.49 15:58, 19 August 2011 (UTC)
Not only are "all of these problems absolute non-issues as long as the filter is strictly opt-in"; the existence of the filter is not much of an issue by itself, even if it were opt-out. People are looking at the wrong part of the problem.
I think the root problem lies with the underlying data used to feed the filter. Now *that* is where the can of worms is. Start thinking about ways in which such structured data can be attacked (vector 1) or abused (vector 2). Assume an adverse environment (that is to say: attacks and abuse are givens). Now come up with a viable data design. That's actually very hard!
Note that if we use the existing category system (which is not what is currently proposed AFAIK) that system would come under new pressure, due to the stakes involved.
I think censorship is a high-stakes game, and I'm not sure it is possible to merely skirt the edges. --Kim Bruning 13:05, 20 August 2011 (UTC)
@B Fizz. Wikipedia has progressed just fine with a community-driven approach. In fact, it's hard to imagine it having progressed better. When the community identifies a problem, fix it. These "top-down" solutions go against the spirit of the community. As other people have written here, it's only a matter of time before government gets its filthy little hands on guiding the content of Wikipedia once these filters exist. In fact, schools and libraries will instantly apply these filters if given the choice, which will automatically trump the supposed opt-in nature of the filter. This whole issue is a non-issue anyway. I view hundreds of articles a week via the random page feature, and I never find inappropriate material.... EVER. The only place where images exist that some might view as controversial is on medical articles. Those images belong there if they add to the educational nature of the article. You don't withhold important information from a patient just because the patient doesn't want to hear it. If there's other content here that's inappropriate, the only way to find it is to go out of your way and look for it. Jason Quinn 20:24, 19 August 2011 (UTC)

24.148.240.176 05:24, 24 August 2011 (UTC) Busy, busy, busy. This is homologous to the four robocalls per weekend one receives from school principals who natter scripts that are available on their schools' web pages and provide bits of information that are old news to all who might care. Technology applied by the dull because it's there. Avert your eyes, readers, in the old-fashioned way you did back in '09!

The Wikipedia I want to contribute to does not have this feature

Since I can't vote against this... I will add my voice here.

Very simply, the Wikipedia I want to contribute to does not have this feature.

The Wikipedia I want to contribute to does not accept that any image is inherently offensive. It does not get into arguments about how much buttock is required to categorize a picture as sexual nudity. It does not implement religious superstitions in software. It does not accommodate readers who wish to experience a sanitized version of reality.

The Wikipedia I want to contribute to is unashamed to stand defiantly in opposition to any culture or system of belief that teaches that certain things may not be seen, or certain facts not known. It has a culture dedicated to telling the truth as wholly and neutrally as possible, without any bias of commission or of omission. It abhors any suggestion of censorship, or the creation of any mechanism that might someday support censorship.

I know I am not alone in this. I believe that the majority of people who are inspired and challenged to help build a free encyclopedia - on its way to becoming one of the great cultural landmarks of human history - are generally not to be found in favor of building any mechanism for restricting free access to ideas. This simply should not be built because it is a fundamental betrayal of the core values of the project.

Thparkth 15:07, 19 August 2011 (UTC)

+1, it can't be said much better than that. Now where is the "No thanks, and I object to this feature being implemented" button in the vote? Thomas Horsten 15:34, 19 August 2011 (UTC)
By my reading, voting 0 on the first question will do the trick, although whether they will listen to any of the answers that they don't like is doubtful. — Internoob (Wikt. | Talk | Cont.) 20:51, 19 August 2011 (UTC)
I agree up until "[Wikipedia] does not accommodate readers who wish to experience a sanitized version of reality." Sanitized reality should not be forced on anyone, but I do feel that if a reader wishes to experience a sanitized reality, they should be able to. Suppose I want to read the wiki entry for "fuck", but I don't want to see images depicting that word. I don't know whether the article has explicit images or not, but I could just turn off sexual images and read the article. I like being in control over what I see or don't see, and I think it fair to give this ability to everyone, as long as no one but me controls what I see. So someone wants to read about Muhammad but feels that depictions of him are blasphemous? He can turn off Muhammad pictures for himself, and leave the rest of us unaffected. Everyone who opposes this feature acts like someone else's preferences will affect my own ability to experience the raw 100% exposed Wikipedia content. It won't. Anyone who wants the entire uncensored content can still get it, and suggesting that this feature will snowball into a forcibly censored Wikipedia is pure paranoia. B Fizz 16:28, 19 August 2011 (UTC)
Readers are not in control of what they see or don't see, on any web site. The content on the page is the product of editorial decisions. Take it or leave it (or hit the "edit" button and change it), but don't assume you have the right to experience it in a different way than the content creators intended. The bowdlerized article you want to see may be missing critical information, and it may be made non-neutral by your self-imposed censorship. Thparkth 16:43, 19 August 2011 (UTC)
So are you saying that Google Safe Search doesn't exist, or that when the user clicks the button to choose a setting, that the setting is controlled by someone other than the user?
Or do you mean that if you don't control 100% of everything on the website, that you aren't in control of anything at all? WhatamIdoing 00:12, 24 August 2011 (UTC)
+1, and how can I make my voice heard to the deaf who don't want to hear the No they deserve?
Better yet, where is the vote!?! I click the link to SPI from here: http://en.wikipedia.org/wiki/Special:SecurePoll/vote/230 and I get, "Welcome to SPI!
SPI is a non-profit organization which was founded to help organizations develop and distribute open hardware and software. We encourage programmers to use any license that allows for the free modification, redistribution and use of software, and hardware developers to distribute documentation that will allow device drivers to be written for their product.
Read more
This site is powered by ikiwiki, using Debian GNU/Linux and Apache.
This new SPI site is still under construction, please refer to the "legacy" SPI web site for any content not yet present here.
Copyright © 2010 Software in the Public Interest, Inc.
License: Creative Commons Attribution-ShareAlike 3.0 Unported
Last edited 2011-07-14"
No actual way to vote. WTF is actually going on here? A proposal to vote on censorship that contravenes an existing board ruling, with no way to vote and bad questions/options (apparently) when you get there... something really weird is going on here. JMJimmy 15:50, 19 August 2011 (UTC)
+1 At first I was rather in favor of allowing user-requested filtering, even if I don't see the need myself.
But you explain clearly how it's against the core project's values. If a user, group of users, or government wants this feature, they can implement it themselves as a browser add-on or a country-wide subversion-wall.
--Guillaume42 00:28, 20 August 2011 (UTC)
+1 This "vote" is embarrassing and disgraceful. There are approximately half a dozen questions, but the most essential question of all, namely whether we want this misfeature at all, is not even asked. The closest they get is how important the feature is, which is not the same thing at all. (I consider it important: in fact, I consider it very important that we NOT implement this.) --Eivind 06:38, 22 August 2011 (UTC)
You are right that you are not alone in this. It is certainly one of the reasons I dove into Wikipedia head-first. SJ talk | translate   02:47, 6 September 2011 (UTC)

This is not censorship

Reality Check

To those of you who are convinced that this proposal is a form of censorship, and more to the point, to those of you who may have been convinced by all the screaming that this proposal is a form of censorship, I propose the following test. I will agree with you if you can satisfy one of the following conditions. If you can…

1. Name me one image that is removed from Wikimedia’s image banks under this proposal.

2. Name one image of those ten million that I won’t be able to see exactly when I want to under this proposal.

3. Name one image of the ten million that will be hidden from my view unless I personally choose to hide it.

You can’t. You can’t because as much as this may feel like censorship to you, as much as you may want it to be censorship, as much as you want to convince us that, despite what the proposal clearly says, it doesn’t mean what it says, but means the opposite – it isn’t censorship. If it were – you’d be able to answer the questions above.

Wikimedia is a very open society – we all get to say what we want. That is good. But, at some point, reason has to at least be invited to the table, even if it’s given nothing to eat. Vote how you will on this proposal. But to be opposed to it because you feel it’s a form of censorship is a sadly wasted vote. 70.49.184.46

This might come to you as a surprise, but the roughly two hundred posts calling this censorship do not generally assume that you are the victim of it, which is largely responsible for the fact that nobody is able to answer your questions. Images will not be removed from the image banks; they will just be tagged so that third-party filtering systems will make people (possibly not you) unable to access them. The system itself might make people (possibly not you) unable to see some images under the assumption that they are children or students of somebody who just doesn't think they should be seen. Like, say, the students of a Turkish teacher who doesn't like the idea of the Armenian genocide. Or a Chinese teacher who doesn't like Tiananmen Square. Or a Texan parent who doesn't like Charles Darwin. complainer
Your questions do not make sense. You might as well ask "name one fluffy kitten that will die if this proposal passes".
However, there is a risk to Commons, albeit not an immediate one. And that is that if this proposal passes, the opt-out version will be right on its tail. Following that, once the projects have "admitted" that certain images are "unsuitable", or at least "not needed", it makes little sense to keep them, since they are A. controversial and B. without educational value (and hence out of scope). Rich Farmbrough 03:09 20 August 2011 (GMT).
The claims of third-party filtering systems abusing this are silly. The "tags" this system will use already exist; they're called "categories", and you can read about them at Help:Categories. If a third-party system wanted to use our long-existing category system to filter images, then they could have done that many years ago, and they will be able to do that for many years to come, no matter what WMF decides about this tool. WhatamIdoing 00:17, 24 August 2011 (UTC)
They could filter commons:Category:Nudes in art, but the scope of that is not "all artworks containing any nudity whatsoever", nor should it be. I don't think we have a category with that scope, because it isn't very useful as a finding aid, compared to a selection of works where the nudity is actually a significant aspect. If we implement a nudity filter, we would need to create that new grouping.--Trystan 00:45, 24 August 2011 (UTC)
"If we implement a nudity filter, we would need to create that new grouping." Says who?
If we decide to implement a nudity filter, we might well decide that Cat:Nudes in art should be excluded entirely from it. Nobody is guaranteeing that a filter primarily intended to hide, e.g., Commons:Category:Full nudity or Commons:Category:Sexual acts, will necessarily hide marble statues or drawings. WhatamIdoing 17:27, 24 August 2011 (UTC)
Dear 70.49.184.46, you are technically correct (the best kind of correct!). Unfortunately, the ALA comes to our rescue by calling it a "censorship tool" instead. Not the clicky buttons, btw; this is about the categories you would need to create or curate. The clicky buttons were only ever icing on the cake anyway.
Once the tool exists, any (third) party can come along, write some of their own (more evil) clicky buttons, and use them to censor Wikipedia.
Of course, to many: "censorship", "censorship tool" is like "tomahto", "tomayto". But still; thanks for pointing out the difference! :-)
--Kim Bruning 19:41, 24 August 2011 (UTC)

This is proving an excellent intelligence test, if nothing else

Hint: Watch everyone shrieking about censorship. Those are the people that have failed the test. Opt-in thing =/= censorship. 75.72.194.152 02:16, 21 August 2011 (UTC)

Hint: Read all the arguments against the image filter that are not related to claims of censorship. censorship =/= one of the many different very real and valid reasons that speak against implementing an image filter. --87.78.45.196 02:18, 21 August 2011 (UTC)
Images are an integral part of the articles and the wikipedia project. They can be as important as the text. If someone proposed a text filter they would be laughed out of cyberspace, and yet with images, the board can pull off this farce of a referendum (which it is not, but an affirmation of policy) and say it's not censorship at all, let alone a dangerous beginning. Try to at least listen to others and read their arguments. These arguments should be CLEARLY visible to voters before the vote (which they are not). --94.225.163.214 02:52, 21 August 2011 (UTC)
Images are an integral part of the articles and the wikipedia project -- Exactly right, and we as a community should trust our own judgment in determining what needs to be included in articles in order to provide the best possible coverage of the subject matter. If an image is deemed to possess explanatory value that goes above and beyond the devices of prose, then that image should be included and it becomes an integral part of our coverage of the subject matter. If there are valid reasons that speak for an exclusion (such as in the case of spider photos in arachnophobia articles), then that image should not be included. Simple as that. If you don't like water, don't go swimming. If you don't actually seek any level of holistic education, don't use an encyclopedia. --87.78.45.196 04:14, 21 August 2011 (UTC)

You are technically correct. Opt-in thing != censorship. However, the proposed categorization scheme used to make it work is *in itself* something that the ALA would classify as a "censorship tool". --Kim Bruning 19:43, 24 August 2011 (UTC) Hope you don't mind me using the programmers' "not equals"! ;-)

About time

It was about time that this proposal was put forward, and I thank those involved in making sure that it has progressed as far as it has. This is the very minimum in putting forth protections that are required both legally and ethically, and it is rather silly that people would oppose it. Ottava Rima (talk) 23:01, 19 August 2011 (UTC)

I respect your opinion; please respect others' opinions and don't dismiss them as silly.
Concerning the legal aspect, some legal and perfectly acceptable images can be illegal in other countries where censorship is the law. I hope this function will only respond to the user's choice, not to the wish to please some state that wants to censor Wikipedia.
--Guillaume42 23:47, 19 August 2011 (UTC)Reply
To deny someone an option to opt out can be nothing but silly. You claim censorship because people want the ability to ignore you? That is ridiculous. You do not have the right to force other people to view material that is inappropriate for them. To claim that this is "censorship" is really inappropriate and uncivil. Censorship would be blocking people who put the material there. Ottava Rima (talk) 00:57, 20 August 2011 (UTC)
Not implementing this filter does not amount to 'denying' anyone the ability to simply look the other way (as you seem to insinuate). Are you claiming that as Wikipedia stands right now, it's denying you the ability to ignore something it contains because it doesn't offer a filter? We all already have that feature right now. Simply navigate to another page, or turn off your computer. Filtering and tagging content in this way is completely outside the scope of Wikipedia's concern. There are already countless 3rd-party options for denying oneself the privilege of seeing the world as it really is, if said person wants to limit themselves in that way. Pothed 03:15, 20 August 2011 (UTC)
I wasn't claiming censorship on you or supporters of this referendum. I just wish this can be put in place so it respects every user's choice, without easing, even involuntarily, state censorship that would want to impose its view on its citizens. My last message may have been ambiguous, sorry.
Ignoring can be done even if something is visible. But the question is the ability to hide. It can be done third-party. For instance, parental control in the browser can block keywords or URLs (images have their URLs) to protect from violent or sexually explicit content. It could also be done for anyone wanting to hide any category of image. Not implementing it on Wikipedia doesn't mean it'll be impossible.
--Guillaume42 10:22, 20 August 2011 (UTC)Reply
"anyone the ability to simply look the other way " That could only be true if there already was a filter. There isn't. Ottava Rima (talk) 02:47, 21 August 2011 (UTC)
Could you possibly stop calling people who oppose your point of view (which, incidentally, seems to be a vast majority of wikipedia editors) names, and possibly read their arguments about how this proposal dramatically eases third-party censorship? You being perhaps the most fervent defender of this "proposal", it would really help rationalize and civilize the debate. Thank you in advance, complainer
Actually, the majority of Wikipedians are neither for the pornography nor against a filter of it. Also, Wikipedia is not a right and it is a private entity, so there is no such thing as "censorship" here. Tossing the word around is inflammatory and uncivil. Ottava Rima (talk) 02:47, 21 August 2011 (UTC)
This is a non sequitur: censorship has nothing to do with anything being either a right, or a public entity. As for being inflammatory, there are, regretfully, no synonyms to be used, and not using the word because it harms your point wouldn't be fair; it would, in fact, amount to (self) censorship. Complainer
Not even close. Censorship, by definition, implies that you had a right to see something that was removed. You cannot be accused of "censorship" by removing inappropriate things. Otherwise, removing vandalism is censorship. Your arguments are absurd and it is sad that you do not realize it. Ottava Rima (talk) 18:59, 21 August 2011 (UTC)
You know, you'd come out a lot more convincing if you just argued your point without vaguely insulting the other's; if I were to follow in the incivility, I could point at your history on the English wikipedia, which I am not going to do. As for the definition, you might want to consult http://www.merriam-webster.com/dictionary/censorship to see that the term "right" or any of its synonyms is not there. You are also straying, really, off the point, by mentioning "inappropriate things". This discussion is about the implementation of a technical solution to give people the opportunity of not seeing images. By arguing the right of removing inappropriate content, you are actually saying the same thing I, and most (yes, most) other editors are, that is that the solution opens the floodgate of removing content for third parties. The only difference is that, for some reason of moralistic nature, you see it as a good thing. By the way, removing vandalism is not censorship for the plain and simple reason that everybody is free to see it by looking at previous revisions. Complainer
Convince? You and others came out with guns blazing, attacking people, throwing around insults, and making some of the most problematic comments. You made it clear you aren't here to be "convinced". Furthermore, your claims are really off above. The definition is clear: to claim censorship means that it has to be legitimate and necessary. It is not censorship to not look at something, otherwise the Earth is censoring the moon half the day! If you stop using terms in absurd ways then you would have nothing to say, which is the point. Ottava Rima (talk) 20:06, 21 August 2011 (UTC)
Convince? You and others came out with guns blazing, attacking people, throwing around insults, and making some of the most problematic comments. You made it clear you aren't here to be "convinced".
You're the one who initiated a thread by declaring that any opposition to the idea was "rather silly." Since then, your replies have grown increasingly uncivil and confrontational. —David Levy 20:42, 21 August 2011 (UTC)
Um, the only way you could have a point is if my post was the first post on this talk page. Everyone can see that there were hundreds of comments here before I posted. And yes, trying to say people don't deserve to have the ability to turn off images is silly at its best. It is also xenophobic, racist, or really inappropriate at its worst. And don't toss around "uncivil" to me, when you only do so because I pointed out the incivility of your post. You wish to deny people a simple choice because they are not from the same culture or mindset as you, which isn't acceptable behavior. You have to learn to get along with people from other nations if you want to continue editing here. Ottava Rima (talk) 21:16, 21 August 2011 (UTC)
You say that others "aren't here to be convinced," but you initiated this thread by declaring that you had your mind made up and anyone who disagreed with you was "silly."
That's among your milder insults, which also include the terms "bully," "tyranny," "selfish," "fringe," "xenophobic" and "racist" (among others).
Again, I wish to prevent a scenario in which readers are discriminated against because of cultural/personal differences in their beliefs. I don't know what gives you the idea that I don't "get along with people from other nations."
And are you under the impression that my country (the United States) isn't among those in which many people are offended by the content that you've continually cited? —David Levy 00:11, 22 August 2011 (UTC)
You label things as insults that aren't. You made a racist comment. It is that simple. You tried to claim that other cultures were inferior to your special knowledge and that you know better than them. You tried to claim that the minority that agrees with you is superior to the Chinese, Indians, Muslims, etc., all cultures that do not believe that porn is appropriate. If you don't like this being pointed out, then don't say such things. And in the US, most people are offended. We have laws against those under 18 seeing porn for a reason, and you wish to say that we shouldn't allow these children to obey the law, which is irresponsible. Ottava Rima (talk) 00:33, 22 August 2011 (UTC)
Please directly quote the "racist comment" in which I stated the above.
Does it surprise you to learn that I'm morally opposed to pornography? I am. It's because I don't wish to impose my beliefs on others that I take issue with the idea of formally labeling content "objectionable" in accordance with my (or anyone else's) personal opinion. —David Levy 01:00, 22 August 2011 (UTC)Reply
Here is just one of many racist comments from you: "I believe that it would discriminate against members of various cultures by affirming some beliefs regarding what's "potentially objectionable" and denying others." Basically, what you are saying is that because they are from another culture than yours, you get to decide that they have no right to find anything objectionable that you don't. That any of their "objections" must be false, wrong, or undeserved. Your own statement right there is racist. You try to push some sort of ethnic superiority over a large portion of the world. Ottava Rima (talk) 01:23, 22 August 2011 (UTC)
As I've noted repeatedly, you're interpreting those comments to mean the exact opposite of what's intended.
I include some of my beliefs among those that would be affirmed at the expense of others'. I'm saying that I don't want us to deem my (or anyone else's) value judgements correct, thereby discriminating against other cultures by conveying that their contrary objections (or acceptances) are "false, wrong, or undeserved." —David Levy 02:06, 22 August 2011 (UTC)
Sorry, but pretending that I interpret them to be opposite of what they would say would mean that you are arguing for the filter. It is just that simple. Making a statement and pretending that valid responses to that are wrong because "I say so" isn't an appropriate argument. If you think you are being misunderstood, then you should fix what you say. If you are truly claiming that we aren't to value anyone's judgments as "correct", then you would think that the filter must be put in place. Otherwise, you are saying that anyone but you has a bad value judgment. It is that simple. You are discriminating against everyone who isn't you unless you give them the right to filter out what they do not want to see. Ottava Rima (talk) 16:36, 22 August 2011 (UTC)
Again, I support the introduction of a system providing the option to block all images en masse and select individual images for display on a case-by-case basis. This would accommodate all readers' beliefs regarding what is and isn't "objectionable" (instead of attempting to define this ourselves, thereby discriminating against any culture or individual whose standards fall outside our determinations). —David Levy 18:37, 22 August 2011 (UTC)
So, let me understand this, and please tell me a clear "yes" or "no" so I can stop wasting my time with you: you are not trying to convince anybody (I've got to recognize you are not doing so, but I assumed you were at least trying)? Because, if not, there is very little else you would be doing, except for offending. Incidentally, you really, really should follow my dictionary link and read what en:censorship is: the word does not belong to you, and you have no title to attach to it implications that are simply not in its definition. Just to take up one absurd example for the last time, regardless of what the Earth is doing, the moon is a poor subject of censorship since, when you are not seeing it directly, you can still read about it, and see pictures of it, for example, on wikipedia. You can, as long, of course, as somebody doesn't assume that crescents offend Islamic sensitivities. I'm sort of tickled to know which insults I have thrown around, but not quite enough to keep reading yours: I think being a sorry racist egotist who makes no sense will do for now. Complainer
"the moon is a poor subject of censorship since, when you are not seeing it directly, you can still read about it" Without the pornographic images you would still be able to read about it. You defeated your own arguments. Ottava Rima (talk) 23:41, 21 August 2011 (UTC)
The following words, conveniently omitted from the above quotation, are "and see pictures of it." —David Levy 00:11, 22 August 2011 (UTC)
Ever wonder why Google or Yahoo have safe search filters? Ottava Rima (talk) 00:57, 20 August 2011 (UTC)
Because people try to game the search engines to get their pr0n visible to as many people (customers) as possible. Nobody is gaming [Special:Search] to promote porn (I hope!) - and it does not return images anyway so the parallel fails. Rich Farmbrough 03:29 20 August 2011 (GMT).
Actually, wasn't it not too long ago that Wikipedia was in the news for having excessive amounts of porn (including CP in the form of art)? The safe search filter isn't designed for the purpose described, so please go bat your shit elsewhere. Promethean 06:57, 20 August 2011 (UTC)
Larry Sanger, co-founder of Wikipedia, who left Wikipedia after numerous conflicts with other volunteers and who has been critical of Wikipedia, made that allegation. Fox News picked it up & ran with it -- as if it were President Obama's fault. However, no one has made a study to determine whether he was correct -- or was just making unfounded accusations based on spite. (Jimmy Wales did make a unilateral & ill-informed decision to delete some images, but Wales has admitted that he believes Fox News broadcasts the truth -- sad but true.) -- Llywrch 16:37, 20 August 2011 (UTC)
Llywrch, be fair. Agree or disagree with what Larry Sanger charged (and again, I'm not supporting or endorsing anything here), it's clear if one looks at the origins of his action and his attitudes, that he's completely sincere. Also, it's not a case of "make a study" - it's a matter of legal interpretation of obscenity law for one charge, and views on sexual material for another charge. Ironically, neither point, by the way, is addressed by the proposal here. -- Seth Finkelstein 19:07, 20 August 2011 (UTC)
I am being entirely fair. Sanger was the first major example of WikiBurnout, & like many once-important contributors has soured on the project; like anyone talking about a former lover with whom one has had a bitter break-up, he habitually views it in a negative light. As for "make a study", I should have written something along the lines of "investigated"; but that was the best phrase I could think of at the moment. AFAIK, no one has objectively invested his claims of "Commons is full of porn". It's an accusation along the lines of the SCO Group's claim that the Linux kernel was full of plagiarized code: no one honestly knew for sure, & due to lax policies it -- both plagiarized code in the Linux kernel & porn in Commons -- is entirely possible. But a careful audit showed this was not the case for the Linux kernel code base; no one has yet audited Commons. (BTW, I assumed by "be fair", you referred to my comments about Sanger; re-reading my post above, I see I was far more harsh on Wales. Or is any criticism of the "God-King" a different matter in your eyes?) -- Llywrch 22:15, 20 August 2011 (UTC)Reply
And inversely, many wiki-editors act like besotted lovers, blind to the faults of the paramour and hostile to those who would dare point out flaws. That being said, it's clear from the immediate pre-FBI-report messages Sanger wrote that he believes what he's saying about the various material (again, this is just a statement of his beliefs, not agreeing or disagreeing with him). Regarding his charge about violating US law, that's a complicated matter which requires legal expertise to analyze. Regarding "full of porn", that comes down to porn/erotica/educational-value debates, where I wish you good luck to "objectively invest[igate]" (i.e. not the same thing as where you personally would draw the lines). By the way, I've defended the barbarian-king on some occasions where I felt criticism of him was unjustified, though sadly I never seem to get credit for doing that. -- Seth Finkelstein 23:45, 20 August 2011 (UTC)Reply
Enforcing? There is no forced opt-in. Even if there was, you can always opt-out. You wish to deny people a choice, which in itself is the only true censorship. The mere fact that you have people putting up so many comments here while logged out shows that the arguments against the opt-in are completely illegitimate. Otherwise, they wouldn't need to hide their identity or use multiple IPs to make it seem like they have more support than actually exists. Ottava Rima (talk) 02:47, 21 August 2011 (UTC)Reply
Enforcing? There is no forced opt-in.
Straw man. That isn't the context in which the word "enforcing" was used.
The question pertained to the enforcement of a cultural viewpoint when determining which images to deem "graphic," not a claim that anyone will be forced to enable the filters.
Additionally, while no one will be required to block any images from view, their options of what type(s) of images to filter will be determined based on non-neutral, culturally biased standards of what constitutes "objectionable" material (unless literally everything is assigned a filter category, which is unrealistic). Each and every belief that a particular type of content is "objectionable" effectively will be deemed valid (if included) or invalid (if omitted). That's non-neutral and discriminatory. —David Levy 04:05, 21 August 2011 (UTC)Reply
Labeling my post as a straw man when it clearly was not is uncivil. Your statement is really inappropriate, especially when the claims you said don't exist are directly right there for everyone to see. Please do not make up things like that. "determined based on non-neutral, culturally biased standards of what constitutes "objectionable" material" This is really offensive. Are you saying that people don't have a right to block things they don't want to see because the reason they want to block things is a cultural difference? Basically, you are saying "they have a different culture from me, therefore their opinion doesn't matter". That is the very definition of racism. Ottava Rima (talk) 13:37, 21 August 2011 (UTC)Reply
The categories shown are so vague as to be culturally dependent. There is no way to get a neutral standard for the category "Sexually Explicit". Evil saltine 14:08, 21 August 2011 (UTC)Reply
Why would we want a "neutral standard"? Everyone gets to see what is filtered and if they find something they want to see they don't have to filter it. It is that simple. Your argument makes no sense. All filtering is subjective and it isn't forced. Ottava Rima (talk) 16:13, 21 August 2011 (UTC)Reply
It goes against the idea that Wikipedia has a neutral viewpoint when we start making subjective judgments about what is or is not offensive. Whether people have the choice to view or not view images we deem "controversial" is irrelevant to that. Evil saltine 21:16, 21 August 2011 (UTC)Reply
You might be interested in knowing that WMF is far more than the English Wikipedia, and not all projects subscribe to the "neutral point of view" that en.wiki does. For example, Commons has an official policy disclaiming it: Commons is not Wikipedia, and files uploaded here do not necessarily need to comply with the Neutral point of view and No original research requirements imposed by many of the Wikipedia sites. You will find the list of which projects choose to subscribe to that policy here on Meta at Neutral point of view. WhatamIdoing 00:40, 24 August 2011 (UTC)Reply
1. It was a straw man, as it countered an argument that the person to whom you replied didn't make (by attacking a position superficially similar to his/her real one). You applied the "enforcing" quotation to a context materially different from the one in which it was written.
Again, the claim was that the determination of which images to categorize as "graphic" would enforce a cultural viewpoint, not that anyone would be forced to use the resultant filter.
2. Remarkably, you've managed to misinterpret my message to mean exactly the opposite of my actual sentiment.
One of my main concerns is that the categorization system will include images widely regarded as "objectionable" only in certain cultures, effectively conveying an attitude that "they have a different culture from me, and therefore their opinion doesn't matter."
I'm not faulting anyone for his/her beliefs. I'm saying that this feature will have that effect (by accommodating some and not others). —David Levy 17:25, 21 August 2011 (UTC)Reply
If you want to claim it is a straw man then you do not know what the definition of a straw man is. I responded to 100% of what he said and then you started rambling about whatever. Bullying people with incivility isn't appropriate and forcing people to look at things against their will is tyranny. You do not own Wikipedia so stop acting like you do. Ottava Rima (talk) 18:46, 21 August 2011 (UTC)Reply
That IP was me, and I was saying what David Levy is saying. Sorry for the misunderstanding.
No one is "forcing" anyone to read Wikipedia, and some of the text is rather disturbing as well. Are we "tyrants" for not dumbing down the reality for users? If someone comes across something inadvertently that disturbs them, just like with other media, the attitude that should be applied is "that's tough, and what were you doing reading about ten-year-old torturer-murderers if you didn't want to know that?" No one is forced to know reality in all of its colours, and when they do, it should be seen as perhaps an unfortunate, but enlightening experience.
Having said that, I am not opposed to this filter on principle, except that it is not Wikimedia's place to make decisions of "decency". The filter will inevitably be too conservative for some cultures, too liberal for other cultures and just plain off-the-mark for others still. Too conservative means that the filter approaches being censorship, and offends people by telling them that bikinis are indecent, for example (and telling them literally: there will be discussions on the topic and people will say this); too liberal means that the filter is offensive in that it tells people that they shouldn't reasonably be offended by a picture, or is just plain ineffective; and off-the-mark, the most probable outcome, is a combination of the two. Do you see how this is a serious encroachment on NPOV? — Internoob (Wikt. | Talk | Cont.) 19:58, 21 August 2011 (UTC)Reply
1. No, you quoted him/her out of context and countered a nonexistent argument. He/she used the word "enforcing," but not in reference to readers being required to filter images.
2. As noted elsewhere on this page, I have no desire to force anyone to view any image, and I support the introduction of an optional image filter, provided that it covers all images (and enables their manual display on a case-by-case basis).
3. I find your allegations of bullying and incivility mildly amusing, given your propensity to assign insulting adjectives to anyone with whom you disagree.
4. You do realize that this plan applies to many projects other than Wikipedia, yes? —David Levy 20:42, 21 August 2011 (UTC)Reply
1. Nope. No matter how you try to twist it, it won't become true. He said "enforcing a cultural POV". There is no such thing if there is no forcing people to adopt the feature. The only way you can "enforce" something is to -force- it upon people. That means no choice. www.dictionary.com 2. I don't believe that is true or you wouldn't bad-mouth the majority of people in the world who do not wish to be exposed to such images or are legally not allowed to be. 3. You can claim I assign "insulting" adjectives all you want, but it doesn't make it true. I describe actions. You describe individuals. Then you try to deny a vast majority of the world a right to a choice. 4. Why would that even matter? People who can't legally view porn or would get in trouble for doing so don't need to be only on Wikipedia for that to be a problem. Ottava Rima (talk) 21:21, 21 August 2011 (UTC)Reply
Let's break this down.
  1. If we have a small number of categories (as the proposal suggests), we need to include some potentially offensive categories and exclude others
  2. This choice of what categories to include is not culturally neutral.
That's why this proposal cannot be culturally neutral as it stands.
Further problems exist, if we extend the system to the large number of categories required to be culturally neutral.
  1. We enable oppressive regimes
  2. We create a massive categorisation task
  3. We are giving undertakings that will fail
  4. We are creating a PR nightmare
  5. We open the door to proposals to "opt-out" and to filter text - both have already been made while the survey on opt-out for images is running
And that's just for starters. Rich Farmbrough 23:06 21 August 2011 (GMT).
By being available to children while having pornography perfectly visible, we created a PR nightmare. Putting up a filter has never created any problems. The WMF seems quite okay with everything and thinks your potential warnings aren't even close to reality. Ottava Rima (talk) 23:42, 21 August 2011 (UTC)Reply
1. Yet again, you've quoted him/her out of context. He/she referred to "enforcing a cultural POV in that someone will have to make a call as to whether something is graphic or not" (emphasis mine), not in that readers will be forced to filter images deemed "graphic." The enforced cultural POV is the formal declaration that category x is "potentially objectionable" and should contain images y and z. You keep noting that readers will choose what categories to filter or not filter, but they won't choose what options are presented to them.
2. I'm bad-mouthing no one. My personal opinions of what types of content people should/shouldn't deem "objectionable" don't even enter into the equation. My argument is based upon a desire to discriminate against no one, which is why I support a blanket image filter option (automatically covering literally every image subject to which anyone objects).
3. I've listed some of your insults above. What insulting terms have I used?
4. You repeatedly referred strictly to "Wikipedia" and "the encyclopedia," so I wanted to make sure that you were fully aware of the plan's scope. That's all. —David Levy 00:11, 22 August 2011 (UTC)Reply
Your attempt at an argument in number 1, while failing to realize that in the English language your critique doesn't make sense, is a problem. You don't suggest any ability to understand that. I pity you. It is also silly to think that the users won't have any choice in the options, when it is clear that there can be additional categories at any time. It was pointed out many times that the images show that the Pokemon picture can be blocked. Why you ignored that is beyond any explanation. You ignore a lot of things, make claims that are patently false, and try to justify the unjustifiable. Ottava Rima (talk) 00:36, 22 August 2011 (UTC)Reply
1. Thanks for pitying me. That's one of the more polite things that you've written.
2. Indeed, categories can be added at any time (and undoubtedly will be with some degree of regularity). But are you suggesting that every "potentially objectionable" type of image will be included (thereby favoring no culture over another)? How do you address the inherent subjectivity in determining what qualifies (and doesn't qualify) as "potentially objectionable"? —David Levy 01:00, 22 August 2011 (UTC)Reply
Filters are supposed to be subjective because the objective standard of allowing all porn no matter what to fill the pages isn't working, and people need the right to choose what best fits them. Anything is better than nothing. Ottava Rima (talk) 01:24, 22 August 2011 (UTC)Reply
Do you oppose the idea of providing an optional filter for all image files (thereby enabling readers to display their desired images on a per-request basis)? —David Levy 02:06, 22 August 2011 (UTC)Reply
Your question makes it obvious that you never bothered to look at the other page, which removes any legitimate reason for you to be posting. It is clear, in the example of the Pokemon image, that a filter would allow you to minimize any image. Ottava Rima (talk) 16:37, 22 August 2011 (UTC)Reply
That isn't what I'm describing. I'm referring to a system providing the option to automatically block all images en masse, enabling the reader to make case-by-case determinations of which individual images to view. —David Levy 18:37, 22 August 2011 (UTC)Reply
And that IP was me. I genuinely forgot to log in. — Internoob (Wikt. | Talk | Cont.) 04:18, 21 August 2011 (UTC)Reply


This is a partial solution

Unlike most other Wikipedians, I am actually in favor of self-censorship. It is a fact that some people are bothered by certain content and it is a good feature to allow them to filter WP content.

However, this proposal is for image filtering only, and I do not see that there is any intent to extend voluntary self-censorship to text filtering. Personally, I would be happy to be able to convert "F**K" words to this nicer form with the asterisks, or even a more fully-asterisked version "***" (a user-chosen string).

Why is one form of filtering considered worthy but not the other?

If there are sufficient arguments in favor of image filtering, I think most should also apply to text filtering. I would like to see both implemented, for consistency and (as the mathematicians call it) elegance. David spector 12:12, 21 August 2011 (UTC)Reply
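As an aside on feasibility: the masking transformation David describes is mechanically simple. The sketch below is purely illustrative and not part of any actual proposal; the `filterText` and `maskWord` helpers, the word list, and the two masking styles ("F**K"-style partial masking versus a fixed replacement string) are all hypothetical:

```javascript
// Hypothetical sketch of a user-chosen text filter of the kind described above.
// Replaces each word on a user-supplied list with a masked form.

function maskWord(word, style) {
  // "fixed" style: replace the whole word with a constant string.
  if (style === "fixed") return "***";
  // Partial style: keep first and last letter, star out the middle ("F**K" style).
  if (word.length <= 2) return "*".repeat(word.length);
  return word[0] + "*".repeat(word.length - 2) + word[word.length - 1];
}

function filterText(text, blockedWords, style) {
  // Build one case-insensitive regex matching any blocked word as a whole word.
  const escaped = blockedWords.map(w => w.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"));
  const re = new RegExp("\\b(" + escaped.join("|") + ")\\b", "gi");
  return text.replace(re, match => maskWord(match, style));
}
```

For example, `filterText("Oh damn, that hurt", ["damn"], "partial")` would yield `"Oh d**n, that hurt"`. The hard part of any such feature is not this code but deciding, per user and per culture, what goes on the word list.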

Couldn't you try to unlearn your learned aversion to the word instead? It's simple, quick and doesn't open up a can of worms. --87.79.214.168 12:27, 21 August 2011 (UTC)Reply
So he must change because you demand that everyone be as libertine in word choice as you? That is really unfair and inappropriate. Ottava Rima (talk) 13:40, 21 August 2011 (UTC)Reply
That's a good point. Why don't we have a filter, click a button and all "Obscene language" is filtered out? I'm sure we can all come to an agreement about which words are too naughty. Evil saltine 21:28, 21 August 2011 (UTC)Reply
Most chat room systems allow you to filter out cuss words. I would be 100% for that. Ottava Rima (talk) 21:51, 21 August 2011 (UTC)Reply
Well no surprise there. Presumably we would also allow filters for blasphemous text that supports the so-called theory of evolution? Or would we remain strictly on words? I can see the configuration screen now:
Please select which of the following words you find offensive
  1. *********
  2. *********
  3. *********
  4. *********
  5. *********
  6. *********
Please note, all words have been starred out to avoid causing offence.
Rich Farmbrough 22:55 21 August 2011 (GMT).
Are you honestly trying to say someone shouldn't be allowed to not see hard core pornography because someone might be against evolution? That is really weird. Ottava Rima (talk) 23:45, 21 August 2011 (UTC)Reply
Ottava Rima, unless you define "pornography", your use of the term is arbitrary and meaningless. Does e.g. file:Human vulva with visible vaginal opening.jpg qualify as "pornography"?
Also, someone shouldn't be allowed to not see -- Who is forcing anybody to use Wikipedia? --213.196.218.6 01:20, 22 August 2011 (UTC)Reply
Why don't we have a filter, click a button and all "Obscene language" is filtered out? Honestly, it's because there are definitely no English words that are anywhere near as upsetting as the most upsetting images. Just, empirically, people don't get as upset by seeing a dirty word. There are "shock images", but no "shock words" in English anymore, at least none as shocking as shock images. If there were words that could spontaneously trigger instant nausea in large groups, we probably should have a 'click to view' filter around those words too-- in my experience, no such words exist, but I only speak English and I've only been me. -AlecMeta 18:02, 22 August 2011 (UTC)Reply
It's not so simple: [1] --195.14.220.250 18:09, 22 August 2011 (UTC)Reply
There's one that is extremely offensive to most African-Americans in the U.S. Ileanadu 09:15, 30 August 2011 (UTC)Reply


Fundamentally disagree with the report's findings

I do not support the idea that users need to be given tools to protect themselves from certain content. Exposure to information that is shocking, disturbing, and offensive is healthy. It prevents people from living in a bubble where certain parts of the world they don't like don't exist. It does not necessarily persuade people that the content depicted is or should be normative; in fact, it may do the opposite. But pretending that the world is not occasionally violent, sexual, or full of a diversity of religious beliefs is certainly not healthy. I disagree that "respecting the audience" requires the Foundation to facilitate such self-delusion, any more than it requires the hiding or deletion of facts which contradict a given set of mistaken beliefs about the world.

A "delay viewing" filter is not effectual to prevent children from seeing content some people deem harmful to them, and I do not think it should be attempted. Personally, I do not support the idea that content that is appropriate and potentially educational for adults is harmful to children. Shielding children from certain truths about the world only stunts their intellectual and emotional growth, and creates distrust that adults are sharing the whole truth about any given subject.

I do not support limiting the scope of Commons to media which are deemed "educational". Any distinct, legal, freely-licensed media supplied should be collected there. -- Beland 17:21, 20 August 2011 (UTC)Reply

+1 to "Any distinct, legal, freely-licensed media supplied should be collected there."
One of the reasons I'd accept a filter is that I think it will help us move towards a culture of 'wider scope', where all information is totally welcome. There is no such thing as 'non-educational information'.
This filter won't "protect children" and it's clearly not meant to. This is exclusively meant to help non-tech-saavy adults make their own decisions about what they want to see. It's really only good for things like "I'm reading about porn in public so I don't want to show any on my screen" or "I have a horrible fear of spiders and get really upset having to look at them when I didn't expect to". --AlecMeta 18:21, 20 August 2011 (UTC)Reply
Surely pictures of spiders won't be added to the "restricted pictures" category, will they? This is where the whole thing gets kind of ridiculous, when one group of people have to decide what others may or may not find objectionable. 86.179.1.163 20:22, 20 August 2011 (UTC)Reply
Sorry, no, it's probably not literally true. I used the "images of spiders" as an example since it's upsetting but totally apolitical and culturally neutral. I didn't want to single out any single category of images, because we are too culturally diverse to ever rationally reach agreement on what "should" be upsetting to our readers. So if we have only a few filter categories, we'll have to use some 'objectively fair' method to pick the categories, and the presence of spiders won't make the list, I assure you. There are things that stress people out far more than spiders, and I shouldn't let that example trivialize the issue.
Now, if we made the 'perfect' shutterer, then yes, literally, people with serious debilitating arachnophobia could come here to collaborate, on-wiki, to build a spider-image shutterer for themselves, if they really really wanted to. But developing that level of customization may be prohibitively expensive. --AlecMeta 21:17, 20 August 2011 (UTC)Reply
If those are the odd sorts of instances that this filter is intending to address, then I completely agree that it's a total waste of resources. Those issues should simply not be the content provider's problem; the end users should take steps through 3rd party applications to achieve their goals. Or hey... you could just turn off images in your browser settings (this is even easier for someone who isn't tech savvy than this proposed filter would be to figure out). Both of the situations you mentioned are already solved. There is simply NO NEED for the content provider to concern themselves with this task. It's dangerous territory for a site like this to get into, especially when you consider the problem is already solved on the web-browser side of things. More sophisticated filters can be developed at the end-user level. They have no place on the content provider (server) level. Pothed 20:28, 20 August 2011 (UTC)Reply
I agree so deeply that this is the way the web should have worked. Everyone using Firefox, every browser its own platform, every user their own sysadmin. But we never imagined being THIS successful. We're very pervasive-- we're more like TV, now, than a 2000s era website. We're getting readers who are totally not up to doing anything outside of the browser frame, and we're getting LOTS of them. And they just want to know how to stop accidentally flashing inappropriate images at times and places they didn't intend. It's a very small request from an ever-growing number of people, and we can accommodate the request pretty easily and cheaply, so long as we're careful about it. --76.20.61.69 21:54, 20 August 2011 (UTC)Reply
So basically, since we're like TV, it's ok that we introduce bleeping? I like the analogy though, because it powerfully illustrates the folly of the image filter. Consider that television, like much of popular culture in general, suffers greatly because people who don't like certain types of content have so much traction that most parts of the end product are an utter mess, dumbed down to the least common denominator. An encyclopedic project is not compatible with this kind of catering to the masses. People consult Wikipedia to learn, and learn they shall. Either they'll have to learn to tolerate content they don't like for whatever reason, or they'll have to learn how to block those images on their client end themselves (e.g. using the userscript I mentioned and linked to further above). At any rate, narrow-mindedness, laziness and ineptitude shouldn't be rewarded.
As to the spider example. I've brought that up myself, but I found that Greasemonkey userscript and it blocks all images on all Wikimedia projects as long as I keep Greasemonkey activated in Firefox (the script replaces each image with a clickable link to reveal the image with a single click if the user so desires). Since activating/disabling Greasemonkey also takes just one single click, I can "safely" browse Wikipedia without any need for a MediaWiki-side image filter. It is that simple. People who don't like something on TV should mute the set or look away or change the channel. People who don't like certain images on Wikipedia have an easy-to-use and failsafe method readymade and freely available to them via third-party software. I'm emphasizing "failsafe" since the efficiency of the existing categories or any future system of tags or categories for purposes of batch-filtering certain types of images is highly questionable. --87.78.45.196 00:11, 21 August 2011 (UTC)Reply
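For readers curious what the client-side approach involves, here is a minimal sketch of the general technique the IP describes. This is not the actual Greasemonkey script linked above; the metadata block, the `upload.wikimedia.org` hostname check, and the "[show image]" link text are all assumptions for illustration:

```javascript
// ==UserScript==
// @name     Hide images until clicked (sketch)
// @include  https://*.wikipedia.org/*
// @include  https://*.wikimedia.org/*
// ==/UserScript==

// Pure helper: does this src point at a Wikimedia file server?
function isWikimediaImage(src) {
  try {
    return new URL(src).hostname.endsWith("upload.wikimedia.org");
  } catch (e) {
    return false; // not a parseable URL
  }
}

// Replace each matching image with a link that restores it on a single click.
function hideImages(doc) {
  for (const img of Array.from(doc.querySelectorAll("img"))) {
    if (!isWikimediaImage(img.src)) continue;
    const link = doc.createElement("a");
    link.href = "#";
    link.textContent = "[show image]";
    link.addEventListener("click", function (e) {
      e.preventDefault();
      link.replaceWith(img); // one click reveals the original image
    });
    img.replaceWith(link);
  }
}

// In a browser userscript context, run on page load.
if (typeof document !== "undefined") {
  hideImages(document);
}
```

In a browser with Greasemonkey installed, a script along these lines would swap every image for a "[show image]" link as the page loads, which is essentially the "failsafe" per-image reveal behavior the IP describes.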
"I do not support the idea that users need to be given tools to protect themselves from certain content." That is a really, really scary statement and I am baffled that you are posting such. Ottava Rima (talk) 02:48, 21 August 2011 (UTC)Reply
"scary"? No offense, but you're not making a strong case by appealing to emotions instead of explaining in plain terms what exactly you find so problematic with that statement which I for one happen to agree with. --87.78.45.196 03:20, 21 August 2011 (UTC)Reply
"you're not making a strong case" Hiding under an IP and using multiple IPs to try and make it seem like more people are opposed to this than actually are is what isn't making a strong case. Quite the opposite. Ottava Rima (talk) 13:34, 21 August 2011 (UTC)Reply
Appeal to emotions, ad hominem, assumption of bad faith; what's next up your sleeve, you master of logic and reasoning? --87.79.214.168 20:43, 21 August 2011 (UTC)Reply
My dear, the only people the description above applies to are the "it's censorship!" group that resorts to using the IPs to make it seem like they have more people while being completely nasty when referring to the WMF and to the majority of people in the world that don't have their view. Ottava Rima (talk) 21:22, 21 August 2011 (UTC)Reply
My dear banned Wikipedian, if you actually skimmed over this talk page, you would inevitably see that there are numerous valid objections to implementing an image filter which have nothing to do with claims of "censorship". So the answer to my question is: Strawman. That was the next fallacy up your sleeve. --87.79.214.168 21:26, 21 August 2011 (UTC)Reply
Just because you claim there are valid suggestions does not make it so. Log in if you truly believe in what you are saying. If you can look up to see if I am blocked, then you are an experienced user and have an account. Your post only verifies that the opposition to this is acting inappropriately. Ottava Rima (talk) 21:45, 21 August 2011 (UTC)Reply
Mate, like the Wikipedia community, I'm done talking to you. --87.79.214.168 21:52, 21 August 2011 (UTC)Reply
The WMF is really going downhill. Recently this really stupid poll, and now this. And the worst thing is that people get paid to deliver this kind of crap. Money that was donated to spread free knowledge!! The foundation should put the infrastructure in place, monitor the legality of things, and let the communities do their thing. Btw: spiders should be tagged as offensive, because some people think they are. One should not make these a-priori decisions, that is not neutral. The only neutral way is letting those who are offended tag the material that they feel is offensive. People who are not offended by that same material should not make those decisions for the offendable people and try to guess what they might find objectionable. Zanaq 17:57, 22 August 2011 (UTC)Reply

Commons:Project scope: "Wikimedia Commons is a media file repository making available public domain and freely-licensed educational media content (images, sound and video clips) to all." Emphasis mine and this is policy. – Adrignola talk 03:30, 21 August 2011 (UTC)Reply

All images that contribute to the understanding of a subject are educational. Educational does not mean suitable for textbooks for young schoolchildren. Wikipedia makes the assumption that people can learn to use it carefully. Anyone who does not initially realize that articles on subjects that they find disturbing will have content that they find disturbing, will very soon learn. And is anyone really unaware of that to start with? DGG 03:43, 21 August 2011 (UTC)Reply
You guys are agreeing that the filter is a bad idea. Just stating since it wasn't obvious to me at first glance. To me, the filter is basically Wikimedia giving up on the idea that the community is able to (or even supposed to) determine what needs to be included in articles in order to provide the best possible coverage of the subject matter. It's the project giving up on the idea of community consensus. It's the project leadership telling us that we're too dumb to write an encyclopedia, to make a product by which we as a community stand and which we deliver as is, everything in its right place to the best of our judgment. --87.78.45.196 04:00, 21 August 2011 (UTC)Reply
"all images that contribute to the understanding of a subject are educational" That is like saying all laws are the basis of sound governments. If Hitler was able to use schools to help justify the killing of Jews then not everything can be deemed "educational" simply because you teach another. We don't tolerate propaganda or hate, so why would you put forth a statement that you know is utterly too broad and inappropriate? A mere existence of something does not make it appropriate or educational. Ottava Rima (talk) 13:34, 21 August 2011 (UTC)Reply
"Exposure to information that is shocking, disturbing, and offensive is healthy." - Trying to force your subjective view upon others. 68.126.60.76 13:16, 21 August 2011 (UTC)Reply
Being able to use Wikipedia at work without getting fired is healthy. Being forced to be shocked, disturbed, etc, is not. That is called tyranny. Ottava Rima (talk) 13:35, 21 August 2011 (UTC)Reply
If a person has a reason to use Wikipedia at work, why should he be fired because of what it returns? If we have an image hiding feature, maybe he becomes "culpable" for not using it - and who will actually be thinking about that if they have a work related reason to access the site? You might get more people being fired, not less. Better just to stick with our own uncensored perspective. Wnt 17:08, 21 August 2011 (UTC)Reply
Seriously? Are you joking? You are effectively saying that people shouldn't have an ability to use 99% of Wikipedia if they are not supposed to see 1% of it. That is one of the most insane things that someone can promote and is exactly opposite of the mission of Wikipedia to be accessible to everyone. Ottava Rima (talk) 18:56, 21 August 2011 (UTC)Reply
Wikipedia should be accessible to everyone, of course, and we should lobby for that. However, what you seem to prefer is to change Wikipedia in order to satisfy demands of people who would otherwise block it. It is better to be blocked in China than to accept the tiniest bit of control by the Chinese government over Wikipedia's content. If we are uncensored and people block us, it is their loss. If we are censored, everybody loses. Kusma 19:09, 21 August 2011 (UTC)Reply
If you want Wikipedia to be accessible to everyone, the only response is to support this. Anything short of that is directly taking away people's ability to read this encyclopedia. It is NOT better to be blocked because YOU feel the need to force a few graphic pictures on others. That is absolutely inappropriate. Ottava Rima (talk) 20:08, 21 August 2011 (UTC)Reply
I want a complete Wikipedia to be accessible to as many people as possible. And nothing the WMF does gives or takes away people's ability to read this encyclopedia: that is done by the censors that sit in schools, libraries, or governments. Why do you want to collaborate with them and make their work easier? Kusma 20:39, 21 August 2011 (UTC)Reply
Our goal is to make free knowledge available. If some third party tries to block the whole of Wikipedia, that's their problem: we've made it available, so our job is well done. Nobody is forced to look at the article about the penis, but those who do look should not be surprised that there is a picture of a penis. Muslims are not forced to look at the article about Muhammad, but they should not be surprised that there are historically relevant pictures (probably created by Muslims!) there. We should not waste our time on people's sensibilities, of which there are far too many, and focus on the gathering (not the hiding!) of knowledge. Zanaq 17:46, 22 August 2011 (UTC)Reply
Yes, yes, yes, and no. This shouldn't be our responsibility, this isn't exactly our job, this isn't what we are here to do. Muslims are NOT forced to look at the article, and no, they should NOT be surprised to find images of him there, and when they do, they shouldn't be upset by it because we're not trying to offend them. (and American Muslims, for example, aren't really the issue-- they understand and if not they'll learn about multiculturalism within a generation or less.)
BUT somehow, this responsibility HAS fallen to us. Muslims _do_ look at the article on Muhammad, often one of the first they visit. They are surprised-- literally shocked. They do get really upset-- some sad, some angry, some confused. They _do_ suffer over it-- those emotions are real.
All this should not be, but it is. Some of our readers are suffering unnecessarily, and we can stop that suffering rather effortlessly. We shouldn't have to be the ones to stop it-- their ISP, their society, or somebody else should have stopped it for them. But the world is real, those readers are real, and as crazy as it is, we're the only ones who actually have the power to fix it for them-- they don't know enough about computers to do client-side filtering and they don't yet know enough about multiculturalism to not have negative emotions over the images.
It is INSANE that we here at Wikimedia are the ones deciding issues that could affect world peace. But here we are! This is what happens when you succeed beyond your wildest expectations-- you have to temper your newfound power with responsibility, you have to adjust ever so slightly, so you don't accidentally stomp on whole societies that you never even imagined you'd get to talk to, and here they are reading our articles. --AlecMeta 18:29, 22 August 2011 (UTC)Reply
Wow, I have to give it to you, you actually beat the Maude Flanders argument: it isn't even about the children, it's about world peace! Now, that is both amusing and impressive, BUT: there are people out there who want all Muslims dead. There are Muslims who want everybody who is not a Muslim dead. All these lovely people generally don't have access to computers and, if they do, they hardly have access to Wikipedia. Yes, Wikipedia is an important project for disseminating knowledge, but on the global political scale, its influence is a plain zero. And don't tell me I'm stomping: I spent a LONG time in a place where westerners would be killed if the wrong people saw them, and I tell you, there is no internet in those parts. --complainer
I do not accept that responsibility. I am not here to protect people against themselves. Our goal is to build an encyclopedia/dictionary/university/image repository, not bring world peace. We should look at our internal values, not the external world: this is cyberspace. Zanaq 18:38, 22 August 2011 (UTC)Reply

Thank you for putting in two short paragraphs what took me four in my vote comments. The proposal is a crutch for the naïve, the deluded and the incapable, and I certainly don't think we should be encouraging these states by equipping them with tools to continue. And especially not in the name of the children, which seems to be the universal replacement for a coherent argument these days. Mathrick 08:11, 28 August 2011 (UTC)Reply

This is not Wikipedia's problem

Oppose: This proposal should NOT be implemented.

When I read it I was reminded of the children's rhyme:

Sticks and stones may break my bones,
but words will never hurt me.

We in the Wikipedia community are being asked to solve a set of problems that should be solved by each individual who has created their own problem.

If a person finds an image offensive, it is their subjective evaluation, that makes it so. Since they are making the judgement, they should solve their problem. It is wrong to try and force anyone else to solve their problem, including this set of reference works under wikipedia.org. (IMHO, the best solution is to stop taking offense.)

The criteria Wikipedia has in place for content are sufficient without this proposal.

Please VOTE against this proposal. Lentower 00:30, 22 August 2011 (UTC)Reply

FYI There is no option to vote against this proposal, at this point in time. Actually.. no wait... --Kim Bruning 01:05, 22 August 2011 (UTC)Reply

Hundreds of different cultures, but only one image filter. Is this the Board's vision of One World?

See, every culture has its own conception of what counts as obnoxious. For example, there seems to be a big difference between moral standards in the USA and in Europe, and a much bigger one between Europe and some countries in Arabia. Yet we would have one set of categories for all these different countries. My point is: how will you build a category that covers the conventions of all these different cultures? Or will the Board take the easy way and promote the American standard for everyone? Or will it orient itself, say, by the conventions of Saudi Arabia, or of Sweden? Sorry, but this content filter is dumb, and everybody should know it. -- WSC ® 08:24, 22 August 2011 (UTC)Reply

I don't mean to be pushy but, if you oppose (or support, for that matter) the proposal, I'd urge you, and successive editors, to explicitly and boldly (in the typographic sense) state it. en:User:complainer —The preceding unsigned comment was added by 77.233.239.172 (talk) 08:29, 22 August 2011
Excuse me! My English is lousy. Could you please restate that? I don't get the point. -- WSC ® 08:39, 22 August 2011 (UTC)Reply
I'd like for you to explicitly write Support or Oppose (or Abstain, if you have no definite opinion) to the proposal. Thanks, en:User:complainer —The preceding unsigned comment was added by 77.233.239.172 (talk) 08:43, 22 August 2011
Well, I don't abstain, but I do oppose all kinds of action taken without due consideration, and this image filter seems really ill-considered. If you want a determination in that manner, I'm Oppose. -- WSC ® 08:52, 22 August 2011 (UTC)Reply
Thank you: I hope you don't mind me editing the typeface to bold: it will help with the headcount. complainer —The preceding unsigned comment was added by 77.233.239.172 (talk) 09:08, 22 August 2011
Never mind! -- WSC ® 09:12, 22 August 2011 (UTC)Reply
There will most certainly not be a "headcount". We run things on a basis of building a consensus by exchanging arguments. If the majority just piles on votes, these will be rightly discarded in the final evaluation. The better arguments must prevail, not the majority. (Also, please sign your comments with four tildes ~~~~. And if you're w:User:Complainer, why not simply sign in?) --78.35.237.218 09:17, 22 August 2011 (UTC)Reply
I think that's not true. The Board seems determined to launch that image filter. If I disagree, it doesn't matter whether I have plausible arguments; they don't care. But Robertmharris, whoever that is, has much more standing than me. Maybe he is much smarter than me? I don't know. But he has no answer to my culture argument. -- WSC ® 09:32, 22 August 2011 (UTC)Reply
I would love for you to be right, although I doubt it; however, I would point out that:
  • The fact that an argument has prevailed is traditionally (and procedurally) expressed on Wikipedia by headcounting. People are free to change their votes until closing time, in case a better argument pops up.
  • There have already been arguments about where current consensus lies as far as this page is concerned. Counting through all the edits, some of which come from IPs that could point to the same user, is laborious and uncertain. This process should fix that.
  • If there is any merit in the Harris essay, more than 80% of the world population should support the proposal, and this majority should mirror itself on Wikipedia. This headcount is a good litmus test of the essay's veracity.
  • I think you should elaborate on the majority votes being discarded, as it sounds dangerously undemocratic; anyhow, the decision on this issue was already taken before it was even announced, and it has been clearly stated that the referendum was just a way of yanking our chain. This headcount is also a statement against that.
Finally, I am en:User:complainer, not User:complainer: somebody is already using this nick on wikimedia. I have explained this above: please don't start a conspiracy theory. If you want me to prove who I am, I can put something on my English user page, but I really think it would be exaggerating.
-- complainer —The preceding unsigned comment was added by 77.233.239.172 (talk) 09:38, 22 August 2011
@77.233.239.172: Relax, I didn't accuse you of anything. Just please sign your comments with four tildes, you know the drill. Your assertions about headcounting are all wrong btw, so wrong in fact that I'll go into detail only if you ask me to. See also w:Wikipedia:Polling is not a substitute for discussion.
@Widescreen: A crowd of people senselessly shouting "oppose" without presenting valid rationales for their opposition will not sway the board. Good arguments hopefully will. Also, this is not a binary decision. Many support one form of an image filter, but are at the same time very strongly opposed to other variants. If (when) the board proceeds and implements an image filter, the exact details of its implementation are very important. --78.35.237.218 09:51, 22 August 2011 (UTC)Reply
Now without irony: the "Harris Report" (sounds like the epochal en:Kinsey Reports) is nonsense. A journalist claims the global moral high ground and doesn't even consider other influences. By his own admission, this guy has never been part of the community. What makes him an authority on how an encyclopedia should spread knowledge? With this report the Board is trying to fool the community: outsourcing important decisions so as not to be responsible for them. Or take the press, for example: show me one newspaper that uses an image filter on its own homepage. And remember, we are not a magazine. We are an encyclopedia. We want to spread knowledge, not hide it. -- WSC ® 09:57, 22 August 2011 (UTC)Reply
You're preaching to the choir, mate. I posted in this section only to set the record straight on the idea of a majority vote. --78.35.237.218 10:05, 22 August 2011 (UTC)Reply
On the German Wikipedia a poll is in preparation: de:Wikipedia:Meinungsbilder/Einführung persönlicher Bildfilter. The aim is to find out whether the German community agrees with this nonsense. I think this is a better choice than exchanging arguments, because the Board gives this fella, robertmharris, more authority than any member of the Wikimedia projects, including themselves. So you can argue till Armageddon; they don't care. -- WSC ® 10:18, 22 August 2011 (UTC)Reply
Uh, yes, I'm going to accept that my arguments are "all wrong" without an explanation. Right. Care to at least explain why you assume only people who Oppose are going to shout? Is this another Silent Majority argument? Anyhow, if I sign with the four tildes, you'll only get my IP address; since this varies every time, it would border on sock-puppeting, something the more argumentative member(s) of the opposite camp would be more than happy to accuse me of. I'll keep on signing with my English username and, for the sake of clarity, urge people not to force-stamp my edits (I'll timestamp, though, if you think it helps). --complainer 10:28, 22 August 2011 (UTC)Reply
Hm? What's the point you're trying to make? -- WSC ® 10:45, 22 August 2011 (UTC)Reply
I was talking to 78.35.237.218; the indenting makes it hard to see. --complainer 10:55, 22 August 2011 (UTC)Reply

I would rather talk about the cultural contradictions. -- WSC ® 11:23, 22 August 2011 (UTC)Reply

Yes, about that: I think holding a poll on the German Wikipedia could play into the hands of those screaming for censorship: it has been asserted above that German and Dutch culture are "libertine" (which, for people born after circa the year 1700, seems to mean "inordinately fond of porn") and (allegedly) represent a minority view on the issue. I'd rather like Indian editors, of whom there are quite a number, to come forth and express their opinion, which, according to the Harris Essay, would be overwhelmingly pro-filtering. --complainer 11:43, 22 August 2011 (UTC)Reply
The question is, does the Board have the right to overrule any community on the basis of explicitly American thinking? Precisely because the German and Dutch people are more liberal than others. But I doubt that Germany is more liberal than other countries. In Germany there are a lot of people you could call conservative. These people have completely different perceptions than liberal Germans have. Maybe the difference is that most conservative Germans would be called liberal in Tennessee (or Toronto?). And here we have another problem with this nonsense: what someone wants to see, and what not, is a highly individual question. -- WSC ® 12:12, 22 August 2011 (UTC)Reply
Conservative Germans are called socialists by some Americans. --Bahnmoeller 12:19, 22 August 2011 (UTC)Reply
We are straying very far from the point here, but, as a rule of thumb, modern Northern and Eastern European right wings are liberal, which is the same term, and basically the same political philosophy, as lies behind the US "left" wing (i.e., basically, the Democrats). The thing is oddly reflected in Danish politics, where the dominant right-wing party is called Venstre (which translates to "left") as a remnant of a political system reminiscent of the current American one. These liberal parties generally support at least some measure of social Darwinism, but are not usually lenient towards either censorship or support for specific religions (which two things usually come together, anyway). --complainer 12:34, 22 August 2011 (UTC)Reply
You're right, that's another question. But I can tell you the political ideology of the Board: simple-mindedness. -- WSC ® 12:47, 22 August 2011 (UTC)Reply
WSC, thanks for commenting. I agree 100% that either we create a system of 'culturally neutral' tagging or we will evolve one over time. Every culture has its own version of things it can call offensive. Every individual has their own version. Consensus can't decide offensiveness. If we don't already know that now, it will become obvious with time. We won't reach consensus on "offensive" because there is no consensus to be reached, only opinions.
The filter will have to do its best to be multicultural and multilingual, and we need to be able to tell people that we'll build them a fully culturally-neutral filter if they give us x dollars to fund development.
The whole point, for me, of doing this is to be culturally neutral and open to non-US cultures. Americans can handle getting their own filter on their own without our help-- it's the people in the global south who most need our help to meet their reading needs. To build a filter that appeals only to western notions of offense would be far worse than not building one at all.
I think we know that, and if we don't, I have confidence it will become very apparent with time. --AlecMeta 16:33, 22 August 2011 (UTC)Reply
The only workable option for a filter "category" is to filter all images. Consider the problem of labeling images as objectionable or potentially objectionable from the other side: Who is to say that a given image will most definitely not be problematic to anyone, ever? Therefore, we simply cannot ever produce a workable filter which filters only objectionable images and all objectionable images. Not possible.
I would welcome a simple, general image filter that could be quickly enabled and disabled. Only with such a filter system could people actually decide for themselves what they want to look at and what they don't. --195.14.220.250 16:43, 22 August 2011 (UTC)Reply
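The "general filter" idea above can be sketched without any labelling at all: a toggle that blanks every image, regardless of content, so no one has to judge what is objectionable. This is a hypothetical illustration only (the sample HTML is invented), written as the kind of rewriter a client-side tool or proxy might run:

```python
# Minimal sketch of a blanket image filter: hide ALL images when enabled,
# pass everything through untouched when disabled. No tagging, no
# categories, no judgement calls.
from html.parser import HTMLParser

class ImageStripper(HTMLParser):
    """Re-emit HTML unchanged, except every <img> becomes a placeholder."""
    def __init__(self, enabled=True):
        super().__init__(convert_charrefs=False)
        self.enabled = enabled  # the user's on/off toggle
        self.out = []

    def handle_starttag(self, tag, attrs):
        if self.enabled and tag == "img":
            self.out.append("[image hidden]")
        else:
            self.out.append(self.get_starttag_text())

    def handle_startendtag(self, tag, attrs):
        # self-closing tags like <img ... /> take the same path
        self.handle_starttag(tag, attrs)

    def handle_endtag(self, tag):
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

    def handle_entityref(self, name):
        self.out.append(f"&{name};")

    def handle_charref(self, name):
        self.out.append(f"&#{name};")

def strip_images(html, enabled=True):
    """Return html with every <img> replaced by '[image hidden]'."""
    parser = ImageStripper(enabled)
    parser.feed(html)
    parser.close()
    return "".join(parser.out)
```

The point of the design is exactly the one made in the comment: because the toggle treats all images alike, it needs no category data at all and therefore creates nothing a third-party censor could reuse.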
ACK to IP 195.14.220.250. I think AlecMeta doesn't really understand my argument. See, in Saudi Arabia, for example, it's offensive to show a woman's knee, or even a woman's face. That is never compatible with the beliefs of European people, who want a picture of Sue Gardner's face in order to have an idea of who she is. These differences are too big to be equated with a Commons category. -- WSC ® 17:44, 22 August 2011 (UTC)Reply
I think I understand and agree with your argument. I don't believe a few Commons categories can do this job either; I think it will take unlimited categories to do this right. I'm just open to being proven wrong by experiment if the board wants to try. --AlecMeta 21:12, 22 August 2011 (UTC)Reply
@Alec: this is starting to freak me out. I thought the proposal was, in principle, to offer people a way of not seeing images and, according to me and a number of other pessimists, in practice a means of facilitating third-party censorship. Your comments about a tool for the "Global South" (a bizarre notion, by the way, since there is a lot more moralism in, say, Syria than in, say, Uruguay) seem to indicate that censorship was the intended purpose of the tool right from the start. I also completely disagree with your time argument: time, imho, will just show that something much like the current Wikipedia will rise and take the place of the mangled version we (well, you, actually: I'll be off as soon as I see an alternative) will be offering. --complainer
You are right, complainer (@en). I think Alec doesn't understand that free science is not a pawn of the WMF's political purposes, because it is not the aim of an encyclopedia to be twisted by moral reservations. I think that was Jimbo's big aim: to start an encyclopedia that would change the world. That's not possible if the encyclopedia isn't one. Jimbo is a salesman, not an en:Encyclopédiste. He had the idea of an encyclopedia but no notion of what that means. Sue is a journalist, and so is robertmharris. But an encyclopedia is scientific, not moralistic. Think about what the great en:Encyclopédie would have changed if its writers had considered the feelings of monarchs. The encyclopedia shall change the world by distributing knowledge; the world shall not change the encyclopedia by distributing moral reservations. -- WSC ® 18:30, 22 August 2011 (UTC)Reply
I think it's more about respecting the feelings of the people and not the kings. You have the top-down perspective here, Widescreen. Please don't try to disguise your contempt for individual sensitivities as antiauthoritarian. Adornix 19:10, 22 August 2011 (UTC)Reply
Mind if I do it, then? I am in the best position to, being personally full of contempt for "individual sensitivities", but WSC's argument is sound and raises a legitimate concern, namely: is this a first step towards some kind of commercialization of wikipedia? Jimbo has been panhandling at increasing frequencies for a long time, and this software upgrade is going to cost a lot in servers; since it is fundamentally geared towards currying the favour of the poorer parts of the World, it can't bring that much in donations. And while we are at it, a couple more, namely:
Was this proposal even voted on by the WMF, or was it imposed on them too? After all, the WMF is elected by the users, and the principle of representative democracy is that one gets screwed by one's elected representatives as a punishment for having voted for them; it is part of the game, and everybody in the West is used to it (I am not being sarcastic, by the way).
Was the Harris Essay an actual guideline, or an excuse for a decision taken in advance?
--complainer
"contempt for individual sensitivities" -- Oh gawd, here comes the "all liberals are baby-eaters" line of "reasoning". It was merely a question of time, I guess. It's always either that, or the "bleeding-heart liberals" epithet, depending on current mood and water temperature. The only good thing about rightwingers is that they are reliable. --195.14.220.250 19:21, 22 August 2011 (UTC)Reply
Sorry, that's not a step toward commercialization, it's a step toward stupidity. I meant that Jimbo is a businessman, not a philosopher (maybe a net-philosopher). Jimbo has other aims, and those aims are not only commercial but idealistic. But those aims are not necessarily compatible with the aims of an encyclopedia. You are right, complainer, to ask the Board whose idea it was to start this filter, and also to ask about Jimbo's influence. Jimbo can't take criticism. Last year he tried to delete content with the authority of his founder flag. Now, believe it or not, we have a content filter here, proposed by robertmharris and his daughter, a guy who worked with Sue. You can fool some people sometimes, but you can't fool all the people all the time. -- WSC ® 20:48, 22 August 2011 (UTC)Reply
The way we got here is that when Jimbo did his deletions, everybody screamed "No!". So then we spent a year and a half thinking about how to do it AND keep our values. I think they've basically succeeded. As for motivations, Jimbo, Sue and RobertMHarris are all very very different people, with different agendas and different methodologies and different skills. Imagine them as a conspiracy if you want, they all believe they're problem-solvers working in public, not conspirators working in secret. --AlecMeta 21:17, 22 August 2011 (UTC)Reply
This is fascinating, but I really don't see anything even close to an answer to any of my questions, although we have at least two WMF members regularly contributing. Am I missing something? Or, worse, am I hitting something? --complainer 21:39, 22 August 2011 (UTC)Reply
I'm not sure I have any good answers for you myself. I don't think there's any bad motives here. I think the Harris report is just a report-- it has no "authority" beyond its ability to inform, it wasn't meant to. The Harris report sought, in good faith, to find a way for us to "have our cake and eat it too"-- to construct a win-win that would meet the needs of both anti-censorship editors AND anti-shock-image readers. I think it succeeds, maybe it fails, but I'm very confident it was written in good faith, trying to find the middle ground win-win. --AlecMeta 23:19, 22 August 2011 (UTC)Reply
VALUES? What kind of values? Maybe these values are fantasies of not being criticized by administrative bodies or conservative powers? And what about my values? Why do you never ask me what I think about this idea? What are your values? Moral reservations and knowledge were never compatible. Do you think all these people here are writing a moral treatise? No, we are trying to write an encyclopedia, believe it or not. What would you think if you were a physician writing an article about the human body, knowing that a reader who is not logged in might not see any of the pictures you collected, or maybe shot yourself? What would you think if you wrote an article about political riots, knowing that your tables and pictures are invisible to anyone who doesn't want to be stressed by these issues? Writing an encyclopedia is not for fun; you can't bend all the content as you please.
And did you never think it might have been wrong for Jimbo to delete the pictures? Why do you think he was right? Don't you think there were reasons for everybody to scream "NO!"? I can tell you why they screamed NO! I have to tell you, because it seems you have never thought about it! They screamed no because they don't want to be censored. Amazing, don't you think? What makes you believe Jimbo was right about the deletions, so that you are now a proponent of this stupid filter? Are you so much smarter than everybody who screamed NO!? I don't think so. -- WSC ® 21:48, 22 August 2011 (UTC)Reply
I care about your values, I care about your opinion, and I believe the board does too-- they asked for opinions in a giant way.
And did you never think it could be wrong Jimbo deletes the pictures? ... Don't you think it could have reasons from "everybody to scream NO!" I do think it was wrong, I did then, and I screamed as loud as I could for him to stop. I understand exactly why people screamed "NO!"-- I screamed "No!" too. That's why I'm here trying to tell people that this is a new and different approach-- I was so loudly "No" then, I have a certain duty to say "Yes" now that they've met my understanding of 'Wikimedia Values'. --AlecMeta 23:12, 22 August 2011 (UTC)Reply
Oh Alec, now you try to fool me twice. You said: "they asked for opinions in a giant way." But that isn't true and you know it. They only asked Robertmharris and his daughter, and they don't care; they haven't got a clue about encyclopedias. And you screamed no too, while Jimbo deleted the pictures? If you are User:Alec on Commons, you have said nothing since May 2008. Tell me, where did you scream? At home while watching TV? And yes, you are right, this is a new and different approach toward censorship. And congratulations, Wikimedia met your values. But it doesn't meet mine. But let me think about this! Maybe your arguments (whatever they are) are as important and smart as Robertmharris's? Because Robertmharris is such a smart fella. -- WSC ® 08:38, 23 August 2011 (UTC)Reply
 
[Filter mockups: an image of the Venus of Willendorf with a "Hide Content" button, and the text "en:Bomis was a dot-com company founded in 1996 by Jimmy Wales and Tim Shell." with a "Forget Content" button.]
Harris wasn't looking for what 'he' wanted, he was looking for something everyone could like. He's a mediator, not an 'expert'. He's proposed the framework for a treaty he thought we'd like, but it's up to us whether to adopt it. No one can force this on us against our consensus-- we proved that last year. What you're seeing is the completely OPPOSITE approach from last year-- massive discussions after discussions after discussions, LOTS of community involvement, and taking things slowly.
And congratiolations, the wikimedia met your values. But they don't met mine. and that's why we needed to call a referendum on this. My values are no better than yours, I have no reason to believe I'm smarter than you. Nobody even cares whether Harris is smart or not, we just care whether he got it right or not. And to find out how everybody feels, you need a global referendum and discussion! There is no other way for us to have even found out your values.
I agree the referendum questions have some issues, but this is our first time doing it, nobody had any idea how wildly successful a survey like this could be. Nor do we have any agreed method for interpreting the results. It's not a conspiracy-- I'm convinced. It's good people trying to find out what their readers & editors want of them, and trying to do their best to balance it all. They make mistakes all the time, and they learn from their mistakes all the time too. --AlecMeta 17:42, 23 August 2011 (UTC)Reply
First of all: Harris wasn't looking for anything everybody could like. He was looking for a way to justify the filter at all costs, and his "report" is used by the Board to legitimize an instrument of censorship. If he is a mediator, as you say, he would have to be absolutely impartial. But Sue was his boss for years at the Canadian Broadcasting Corporation. Curious, isn't it? You are fooling the communities! And besides, a mediator needs both parties in order to mediate. But he never asked the Germans, and I'm sure he asked nobody but the Board. F o o l i n g. Do you think all the authors of scientific articles are stupid jackasses?
And you are right, the Board took the easy way. And you are also right when you say you are taking things slowly, because the more aggressive way failed. No one from the German community asked for this filter, and I'm sure most German authors don't want it now. But that doesn't matter, right? Jimbo and the Board want the filter, and we will get it, no matter whether we agree.
And my values are: I don't want any filter or other censorship. And the difference between your values and the Board's and mine is that I can give you reasons why this filter is nonsense and why we don't need it. The German newspapers say the Board wants to expand into the "south": South America, Asia. But what will you export? Censorship? I think those goods are not needed there. Will you teach these people that they don't have to face the truth if they don't want to?
And what you call a referendum is a joke! Ask us whether we want this filter! No, because you know the answer, and you don't care. Because we are all dumb, right? Only Robertmharris, Jimbo and Sue are smart. You try to fool us as if we were dirt. And you can't balance anything, except by quitting your plans and accepting that you don't have to equalize anything or censor it. Just keep your hands off Wikipedia and buy some servers. -- WSC ® 19:33, 23 August 2011 (UTC)Reply
This keeps coming up, so I suppose that the explanation bears repeating: The existence or non-existence of this tool will have no effect on third-party censors (except, I suppose, to possibly, but only very slightly, reduce sales of the lowest-end products).
The means by which this tool would operate already exist and have been usable by third-party censors for years and years. The means by which this tool would operate are called "categories"—the current, existing, long-standing categories, like Commons:Category:Abused animals by type, not some made-up, brand-new, filter-specific categories. If a third-party censor wanted to use categories to filter images, they could have done that years ago, they could be doing that now, and they could do it tomorrow, even if the WMF decided to dump the tool.
Enabling this tool on (some or all) WMF sites will not enable third-party censorship any more than said censorship is already enabled by the mere existence of some attempt at organization at Commons. WhatamIdoing 00:55, 24 August 2011 (UTC)Reply
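The mechanism described above (that existing public categories already give a third party everything it needs to filter) can be illustrated with a short, hypothetical sketch. The category name below is one already discussed on this page; the image titles and the hard-coded membership table are invented stand-ins for data a censor could harvest today from Commons' public API or database dumps:

```python
# Hypothetical blocklist: any existing Commons category could be chosen
# the same way, with or without an on-wiki filter feature.
BLOCKED_CATEGORIES = {"Category:Photographs of sexual intercourse"}

# Invented image titles and category memberships, standing in for data
# that is already public (MediaWiki API / category-links dumps).
IMAGE_CATEGORIES = {
    "File:Anatomy diagram.svg": {"Category:Human anatomy"},
    "File:Explicit example.jpg": {"Category:Photographs of sexual intercourse"},
}

def is_blocked(image_title):
    """True if the image sits in any category on the censor's blocklist."""
    return bool(IMAGE_CATEGORIES.get(image_title, set()) & BLOCKED_CATEGORIES)

def filter_page_images(image_titles):
    """The images a category-based filtering proxy would let through."""
    return [t for t in image_titles if not is_blocked(t)]
```

Nothing in this sketch depends on the proposed tool existing; that is exactly the point being argued, since the category data it consumes predates the proposal.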
You ignore the fact that most categories (such as violence) don't contain only images that show actual violence. There are images of protests against violence, caricatures against violence, monuments dedicated against violence, and so on. All relate to violence, but not all show violence. That is correct, since none of these categories was created or meant to do content filtering: they are a guide for finding content. This is a huge difference. If we really want filtering based on categories, we will need to introduce new categories outside the current categorization tree. This will in fact enable censors to exclude "objectionable content". This is not a pro argument; it is a contra argument. (Read the Library Bill of Rights and its subsections, especially Labeling and Rating Systems.) --Niabot 01:40, 24 August 2011 (UTC)Reply
@WhatamIdoing: I think they aren't even fooling the communities; they are fooling themselves. The filter could be used by organisations, churches and other institutions to start their own censorship. You can't prevent this with technical gimmicks. It's not like in the "north", where everybody has his own internet account. And anyway, the only "personal censor" should be the super-ego, and least of all Wikipedia. -- WSC ® 05:55, 24 August 2011 (UTC)Reply

arbitrary break

Niabot, if you have guessed that I would not include the main Cat:Violence, you're right. Most of the images in it, although related to violence, are not images showing either violent actions or the outcome of violence. I'd be far more likely to recommend that an anti-violence filter include Commons:Category:Wounded people or Commons:Category:Abused animals by type—specific categories that already exist and are primarily the sort of thing that squeamish people do not want to see. WhatamIdoing 17:42, 24 August 2011 (UTC)
WSC, since there's nothing new being created here, there's nothing new for organizations, churches or other institutions to start new censorship programs with. We can't prevent that with technical gimmicks because we can't prevent that now. WhatamIdoing 17:42, 24 August 2011 (UTC)
WhatamIdoing: You are right, we can't prevent that now. But we should never promote it, and the software could be abused. Don't you think it's paradoxical that an encyclopedia founded to spread knowledge designs a filter that can be abused for censorship? But this is not the only argument against this filter. -- WSC ® 18:44, 24 August 2011 (UTC)
I do not understand how you think this filter could be used for censorship. Unlike the various proposals made by opponents, this filter adds zero information to the images. There are no tags or blacklists in this filter. How can "tick here if you personally want to hide the images that have been listed in Commons:Category:Photographs of sexual intercourse for three years now" result in a system that can be abused for censorship, but "there's nothing to tick here, even though the images are still located in the same category" not result in a system that can be abused for censorship?
What's the specific mechanism? Do you think that if we don't provide the filter, then the would-be censor will somehow not be able to discover that this category exists? Do you think that someone capable of writing software to restrict access to this category the day after the filter is deployed (assuming it ever is) would somehow (magically?) be incapable of writing software that restricted access to the same category the day before? WhatamIdoing 19:21, 24 August 2011 (UTC)
Excuse me, I don't understand your point because of my bad English. Could you please restate it? -- WSC ® 19:36, 24 August 2011 (UTC)
That works fine WhatamIdoing, just beware of people hijacking that category to classify "objectionable content" instead! --Kim Bruning 19:46, 24 August 2011 (UTC)
It seems like whatamidoing is not interested in reading my answer? -- WSC ® 18:08, 25 August 2011 (UTC)
I read your answer. I have also determined that your native language is German. I am attempting to obtain a good translation for you. WhatamIdoing 20:57, 25 August 2011 (UTC)
Maybe you don't have to translate it, just rephrase it. -- WSC ® 22:06, 25 August 2011 (UTC)
This is a translation of my above comment from 19:21, 24 August 2011:
I do not understand how this filter lends itself to censorship. The filter attaches exactly zero new information to the images (unlike the many counter-proposals). There are no tags or blacklists here, only a preference. How can "click here so that images that have been in Commons:Category:Photographs of sexual intercourse for three years now are not immediately visible" somehow be abused for censorship, while "the images of sexual intercourse are still labelled as such, but you have no individual choice whether to see them or not" cannot be abused just as easily?
How is that supposed to work? Will the censor fail to discover that these categories exist just because there is no filter option? If someone is able, after the introduction of the filter (should that ever happen), to write software that restricts access to an image category, why would that not also be possible without the filter, given that the crucial information (the categories) already existed beforehand? WhatamIdoing 18:52, 27 August 2011 (UTC)
Hi, whatamidoing: Is this a Babelfish translation? You had better rephrase it. But thanks for the effort; I think I can imagine what you are trying to say. First of all, this filter is mainly a tool of personal censorship: if you don't want the whole truth, you don't have to face it. For example: isn't it educational to show the whole horror of the Holocaust? Do you really think it's enough to describe how many people were murdered? Don't you think it could be important to let some people know how brutal and inhuman the Holocaust was? If we, the authors of this encyclopedia, think it's necessary to show these pictures, it's absurd to let everybody decide whether they want to see them. Especially because my old schoolbook shows some pictures you would certainly have found "brutal", since many pedagogues found it didactically helpful, or unavoidable, to show them. So why does the board think it has to prohibit pictures like that? And this is just one example. We are the authors, and we decide in a collective process whether a picture is necessary or not! Someone who disagrees shouldn't read an encyclopedia, but rather the Bible, the Koran or "Mein Kampf". We also convey a mindset of what is necessary to understand a topic. I don't want to spread the idea that a female breast, or an example of man's brutality, is not worth seeing.
But you can also use this filter for public or other censorship. This matters most where internet access itself is a rare commodity; I am thinking of the third world, where not everybody has their own internet connection. Parties, churches, schools and others are able to use this filter for their own censorship simply by prohibiting opting out. People would see just the Wikipedia some moralizer wants them to see. This has nothing to do with the aims of the authors. It's bad enough that they can restrict us with external tools; we cannot support that for even one second.
And of course you add new information to the pictures! You add a category like "explicit pictures", or something similar. You tell people what's "explicit", "offensive", "brutal" and, above all, what's not necessary and not worth seeing. This is a decision the authors of the article should make.
Finally, I want to talk about your example, Commons:Category:Photographs of sexual intercourse. Most of the pictures aren't used in articles at all. In fact, I found only two pictures in use in Wikipedia! One of them in, most likely, the Arabic WP, the other in some European and Japanese WPs. It seems nobody has a problem with that, except for some moralizers like the board. So why do you have a problem with it? It seems to me the board is striving to create a problem that no one else can see. -- WSC ® 10:16, 28 August 2011 (UTC)
No, that is not a machine translation. It was made by a person who speaks both German and English fluently; he was born, raised, and educated in Germany. He has an Abitur: do you? Perhaps your lack of understanding says something about your education.
I am sorry to see that you are very misinformed about the proposal. (Perhaps someone used babelfish to translate the German pages.) Here is a list of some of your errors:
  • Pictures are not added to Wikipedia articles solely because someone believes the picture to be necessary to understand the subject. Perhaps that should be true, but it is false. Many pictures are added solely because the person thinks that pictures are pretty or because they want their own photograph in the article.
  • Wikipedia is not the only WMF project. The filter will be available on Commons itself, where 100% of the images in Commons:Category:Photographs of sexual intercourse are used. "Used in Wikipedia, which is an encyclopedia" is not the same as "used in any WMF project".
  • There will be no new category like "explicit pictures". There will be only a list of existing categories.
  • Because the filter does not create new categories, it cannot be used for public or other censorship. Nothing changes about public censorship:
    1. Public censorship can be done today, with the existing categories.
    2. The same categories will still exist if the filter is created.
    3. If the filter is turned on, no new method of public censorship will be created.
    4. If the filter is not turned on, then the current method of public censorship continues to exist.
I ask you again: If the censor does not use the existing categories for censorship today, then why would the censor suddenly begin to use the existing categories for censorship tomorrow? Is the censor too stupid to use the existing categories today? Will turning on the filter magically make the censor smart about the existing categories? WhatamIdoing 23:11, 28 August 2011 (UTC)
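The point being argued here, that category data is already public with or without a filter, can be illustrated with a short sketch (a hypothetical illustration, not part of any proposal). The endpoint and parameters are the real, documented MediaWiki `action=query&list=categorymembers` API; the helper function itself is invented for this example, and only builds the request URL:

```python
# Hypothetical illustration: a would-be censor needs no new tags or
# labels, because existing Commons category membership is already
# public through the MediaWiki API (action=query, list=categorymembers).

from urllib.parse import urlencode

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def category_members_url(category: str, limit: int = 500) -> str:
    """Build the API request that lists every file in an existing category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,   # an existing category, e.g. one named in this discussion
        "cmtype": "file",      # only files (images), not subcategories
        "cmlimit": limit,
        "format": "json",
    }
    return COMMONS_API + "?" + urlencode(params)

print(category_members_url("Category:Photographs of sexual intercourse"))
```

Since this request works today and would work identically whether or not the opt-in filter exists, it sketches why the filter itself would add no new capability for third parties.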
There will be no new category like "explicit pictures". There will be only a list of existing categories. This, exactly this, would never work. In the case of sexual content it works a little better than elsewhere. But in many other cases you would block dozens of harmless images (not that I think that any of our pictures are harmful). Consider something like violence or war. There aren't only images that depict brutal scenes or crimes. You will find memorials, peaceful demonstrations, portraits of innocent persons, etc. Blocking them all would cause needless collateral damage.
Public censorship can be done today, with the existing categories. You can't do that, as I explained already.
The same categories will still exist if the filter is created. I doubt that, because you will need new categories to separate "harmless" from "harmful" images. The categorization system was and is meant to sort our content for faster navigation and to gather images (galleries, if you will) for topics. There isn't, or shouldn't be, any distinction based on controversy. If there is, we are already doing it (partially) wrong.
If the filter is turned on, no new method of public censorship will be created. Since the filter can't be built on top of the current categories, you would need to do exactly that.
If the filter is not turned on, then the current method of public censorship continues to exist. Hopefully not, since Wikipedia and its projects aren't censored. OK, I have to admit that EN is censored, since a lot of hot-headed censors are doing their job: thinking about what others must think. --Niabot 01:02, 29 August 2011 (UTC)
@whatamidoing: If this is a good translation, it must be an awful and naive argumentation. ;o) But seriously: it's a matter of who is censoring. If anyone tries to censor Wikipedia, we should struggle against that, not provide our own censorship, even if it's not as bad as the others. The five pillars of WP don't say that we have to be conformist, or that we have to twist knowledge to make it comfortable for anyone. Really, I think it's a betrayal of the principles of Wikipedia. If anyone takes our categories to censor Wikipedia, we shouldn't do the same.
Next: you are right, not every picture is added to explain a point. But the authors of the article found it helpful or pretty to add it. The filter, however, doesn't hide only pretty pictures, but all pictures listed in certain categories which have never been clearly defined. This is a point you should think about, rather than going on about my education or my lack of understanding of the intent of this filter. The authors write Wikipedia, and they shouldn't be overruled by some machine and some moralizers putting pictures into forbidden categories.
"If the filter is turned on, no new method of public censorship will be created." So you say! It would be helpful if you answered my argumentation. If you want to spread dogmas, join the Catholics; they have 2000 years more experience with that than the board. But I reckon that if you really answered me, you would have to recognize the true meaning of this filter.
"If the filter is not turned on, then the current method of public censorship continues to exist." Yes, that's also true! And that's the reason why we need another kind of censorship, created by the board. Because there's not enough censorship on this planet.
What you say, that there's no need for new categories, is not true; Niabot replied to that. Maybe you take him seriously, maybe not. But this point shows how much thought you have given to this filter.
Yes, on Commons you can view all these sinful pictures, and all these violent pictures showing the truth about human beings. So you can start your own world: small cats will join little children playing in the sun. People live together (without copulating with each other) and watch Wikipedia with filters, so they don't have to remember that we want to distribute knowledge, not evangelical moralizing. Come on, do that, but don't bother us while we try to write an encyclopedia. If someone wants to see a couple having sex, they don't have to watch that on Commons; there are millions of sites like that, containing videos, close-ups and really strange things. And some harmless pictures are in use on Wikipedia and Commons. -- WSC ® 07:18, 29 August 2011 (UTC)

┌─────────────────────────────────┘
Again: no tags, and no new categories. It uses the existing categories, full stop. The report specifically says not to use tags or other special, filter-only labels, because doing that would enable involuntary censorship. (It's in Recommendation 10.) So we're not going to do this: it is not in the proposal.

Niabot, I'm sorry that you, with your limited knowledge of writing code for Mediawiki, have decided that the devs can't possibly do what they have said that they most certainly can do. When it comes to deciding what's technically feasible, I prefer to believe the WMF's highly experienced developers instead of random editors.

However, you are right about one thing: The filter will not be 100% accurate. This is a direct and known consequence of the choice not to use special categories like "Category:Images that are offensive because they show naked humans". According to this proposal, some innocent images will be filtered, and some unwanted images will not be filtered. These errors are expected. The risk of these errors is not going to stop them from using this unfriendly-to-censorship model. The risk of these errors is not going to result in us creating new categories or tags that are ideally suited for involuntary censorship.

How many errors we will see depends entirely on how stupid we are about the categories we list. Listing Commons:Category:Violence will suppress many innocent images—so it should not be listed for filtering. Listing Commons:Category:Abused animals by type will not suppress any innocent images—so it is a reasonable candidate for a violence-oriented filter. WhatamIdoing 21:05, 29 August 2011 (UTC)
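The opt-in model described above can be sketched in a few lines (a hypothetical illustration with invented function names, not the actual MediaWiki implementation): hiding is a plain intersection between the categories an image already carries and the categories a reader has opted in to filter, which is why the error rate depends entirely on which categories are listed.

```python
# Hypothetical sketch of the proposed opt-in filter: no new tags, just a
# check of an image's existing Commons categories against the list of
# categories this particular reader has chosen to collapse.

def is_collapsed(image_categories: set[str], opted_in_filters: set[str]) -> bool:
    """True if the reader's own filter would initially hide this image
    (one click would still reveal it)."""
    return bool(image_categories & opted_in_filters)

# A reader opting in to a narrowly scoped, already-existing category:
my_filters = {"Category:Abused animals by type"}

# Hidden: the image sits in a listed category.
print(is_collapsed({"Category:Abused animals by type"}, my_filters))          # True
# Shown: broad "Violence" is deliberately not listed, so a memorial photo passes.
print(is_collapsed({"Category:Violence", "Category:Memorials"}, my_filters))  # False
# A reader who never opts in sees everything.
print(is_collapsed({"Category:Abused animals by type"}, set()))               # False
```

The sketch also shows the trade-off under debate: listing a broad category like Violence would hide memorials and protest photos too, while listing only narrow categories leaves some unwanted images unfiltered.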

WhatamIdoing! See, even if there are no new categories, you have to designate some which are "evil" or which contain content that is not necessary to be seen. Let us call these "sinful categories". And you speak as if you can divine the future: this filter is not running at all yet, so don't tell me things you don't know. Some categories are unusable, that's for sure; they contain pictures not even the board wants to censor. There will be errors, you say. The filter will not be 100% accurate, you say, whatever that means. But that's irrelevant? How many errors are tolerable in order to allow people to censor others, or to censor themselves? You think so, and that's OK with me.
But did you notice that you don't answer the fundamental questions about this filter: whether it is necessary, or legitimate, or useful, or compatible with our aims? You know, spreading knowledge is our aim, I think. But that seems not to be important any more. Whether the categorization will work, or whether there will be errors, are just secondary questions, which show how inconsiderately this filter was conceived. That the filter won't work for every culture, I mentioned at the top. But that doesn't matter to you; it seems to be indifferent to you. We will start this filter, that's for sure, no matter what it costs, or whether it's useful. Whether it's useful to hide some pictures, or to hide any picture at all, doesn't matter. Whether this is censorship? It doesn't matter! Nothing seems to matter but the moods of the board and their silly attempts to explain. And they seem to be in the mood for censorship. -- WSC ® 22:43, 29 August 2011 (UTC)
Who cares if there are errors? If there are too many errors, people will stop using the filters. You don't want them to use the filter, so why do you care if they stop because they are frustrated with it? WhatamIdoing 17:37, 30 August 2011 (UTC)
That's not the question! The question is: is this filter compatible with our aim of distributing knowledge? I gave you a lot of reasons why this filter shouldn't be used. You can look them up; they are all above. But by now it seems to me that you won't take a look at them, and you won't answer them. You only answer some en:Straw man arguments. (Like: I should be delighted if this filter works with errors, because then people will stop using it out of frustration. But they won't find any malfunction. They will just see that a picture is missing; what picture that is, they will never find out unless they take a look. But they don't want to take a look, because they turned on this stupidity filter. Wikipedia will help you to remain stupid. Wikipedia will help you to stay close-minded. Wikipedia will help you if you don't want to face the truth. Forget this knowledge business everybody talks about; the main thing is that your moral reservations will never be put to the test.) -- WSC ® 18:49, 30 August 2011 (UTC)
It seems that WhatamIdoing is not able, or is unwilling, to remove my doubts about the fundamentals of this filter. He doesn't answer the main refutations. The answers of WhatamIdoing don't even touch on the basic inconsistencies of the filter. -- WSC ® 10:08, 2 September 2011 (UTC)

Facilitating Censorship by Governments or Other Groups

I am somewhat opposed to filtering, but I don't see it as a complete censorship problem as people are deciding for themselves whether to see certain images. I think ignorance, even if self-imposed, should not be encouraged, but this value is in conflict with the value of individual choice - the right to decide for oneself. My censorship concerns are also attenuated by the fact that the image is not removed and the users can always reverse their decisions and choose to view the images.

However, I have grave concerns about the ability of groups to use our filtering for their own purposes. We know that there are countries actively censoring internet content for their citizens. I do not want to make it easier for them to do so. Whether and how our self-selected image filtering can be used by others to impose censorship does not seem to have been addressed. I am not a web programmer so I don't know for a fact that a government can piggyback onto an existing filter and turn on the image censoring function for people using servers in their country; it seems logical that it would make it easier for them to censor. If there is a way to do it, I would bet good money that China will find it.

Thus, the main issue that needs to be addressed is whether implementing an image filter will make this kind of censorship, which is not based on individual choice, easier to accomplish. Ileanadu 09:41, 30 August 2011 (UTC)

I think the answer is yes, we will be enabling third-party censorship, and we need to keep that in mind as one of the significant costs of creating the filter. There are two reasons why it is unavoidable.
First, technically, it is very easy to take warning labels we create for a filter and use them in another program. Our openness works against us here.
The second piece of the puzzle is the creation of the labels themselves. The Harris Report and the referendum itself rather uncritically suggest using the existing category system. However, as discussed on the Categories subpage of this discussion, categories for finding and describing something are fundamentally different in scope and application from labels used for filtering. It would be unreasonable to give people a filtering system and then demand and expect that they will not use it to create effective filters. Once created, there is nothing we can do to stop other parties from using them for censorship. --Trystan 13:43, 30 August 2011 (UTC)
Again: we are not creating any new labels. The only labels we'll be using already exist. Recommendation 10 specifically says that new tags are evil because they would enable third-party censorship. If someone can use Commons:Category:Photographs of sexual intercourse (=what we'll actually be using) to censor WMF, then they could do that last year, and they will be able to do that next year, even if we scrap the filter.
Yes, you're right: the existing categories are not well suited for this. But the proposal is to use them anyway, specifically because we deliberately chose an imperfect filter over perfectly designed new tags that would be suited for involuntary censorship.
I don't know why you're having such trouble grasping this: The proposal is to have an imperfect filter with no special handles for censors, rather than a well-designed, censorship-friendly system. The only people seriously considering a censorship-friendly special labeling system are the filter's opponents. The WMF has already said that they're not doing that, even though they know that is the only way to get "perfect" filtering. WhatamIdoing 17:35, 30 August 2011 (UTC)
They will not create new categories (WhatamIdoing says); they turn existing ones into prohibited ones. Great argument, really. -- WSC ® 18:57, 30 August 2011 (UTC)
I'm not having trouble grasping anything; you and I have just come to different conclusions about whether it is possible to implement a filter using the existing category system without fundamentally altering how categories are applied to images. Giving the community an imperfect filtering system and asking that they do not improve upon it does not strike me as realistic. If a category serves both as a retrieval aid and a filter, it is not possible to require individual users to only use it with the former function in mind, ignoring the latter. Particularly if the filter interface has a built-in function for reporting false negatives, as the referendum proposes. --Trystan 19:03, 30 August 2011 (UTC)

Sex, Violence, and/or Religion question

I have no personal issues with viewing "sacred" images, but I can understand how someone else might. At one point, the essay/study on image filtering seems to treat sexual, violent, and religious images as equal categories. When it comes down to the recommendations, however, sacred images seem to have been blithely dismissed because they present different issues. They do not.

If the issue is whether to allow individuals to opt into a system that temporarily hides images that are deeply & personally offensive, one individual can be just as offended by a sacred image as another is by explicit sexual imagery. Once upon a time, a woman's bare ankle was considered erotic. The reason why just seeing an ankle could arouse some people - and shock others - was the rarity of such an image. To a person who is unaccustomed to -- and perhaps forbidden by religion from -- seeing sacred images, the sudden appearance of a sacred image can be as shocking as that Viet Nam picture of the kneeling man who is being executed. In some cultures, the sight of a woman's bare breasts would not be taboo, while in other cultures it would be extremely disconcerting to open up an encyclopedia article and see a bare-chested woman. In one case the sight is shocking because of religious principles, while in the other case, it is social mores that tend to make the sight rare & shocking.

Part 3 proposes different treatment for religious images because “a different intellectual process is involved.” It is the same intellectual process, just shaped by different social factors. No image is inherently offensive; it is offensive because the viewer's culture has inculcated the individual with the idea that it is offensive.

One of the questions on the referendum was whether an image filter “should aim to reflect a global or multi-cultural view of what imagery is potentially controversial.” It is possible that an affirmative answer to the question could be perceived as a reason to not censor religious imagery. Religion and culture are often tied together; in fact, religious views are part of what makes up a particular culture. Since there aren't many "cultures" today that treat sacred images as a taboo subject, the argument goes, there is no need to censor sacred images. This is not just an issue of Muslims barring any depiction of their prophet. Many Judeo-Christian religions follow a concept of barring "graven images" according to their interpretations of the Ten Commandments, which are found in the Old Testament/Torah. Different religions have different criteria for what constitutes a graven image or "idol" (see [[2]]), and this presents a challenge to the image censor. The same can be said for pornography and violence. Cartoons from Japan can show a woman's breasts or even her naked body, so long as the pubic region is censored. In the West, both are covered in public depictions; yet still, we have Michelangelo's David. There are levels of pornography, as the epithet "hard core" demonstrates.

Let's note again that this is self-censorship, which is temporary & reversible. Censoring sacred images is NOT “about the struggle between the rights of some individuals to define the limits of appropriateness of some images for themselves and others, versus the rights of others to know.” In all cases -- religion, violence and sex -- a group has defined the limits of appropriateness and seeks to impose that limit on its members. In all three cases, the proposed image filter would give the individual the choice to accept or reject that definition.

So, IF you are going to have an image filter, then under the principle of Least Astonishment, there is no reason to treat religious image taboos different than sexual image taboos.

Unless, that filter can be co-opted by a government or other group for its own purposes. [see my prior comments]

Censoring images in all of these categories will require a lot of difficult decisions and, I imagine, a lot of work. Is it worth it? How many people saw naked breasts for the first time in National Geographic? How many people first learned the facts about reproduction from reading their encyclopedias? In 7th grade we were all familiar with certain taboo words, but I didn't know exactly what they meant until I saw them defined in the American Heritage dictionary.

There was no cover that we had to move to see these images or read their text because each publication was fulfilling its mandate. I still think the purpose of any encyclopedia, whether in hard copy or internet, is to provide information to users who are looking for it. I realize that with the proposed image filtering the images/information is still there & merely requires an extra step on the part of some users (who have chosen to be put in this situation by opting in), but I am not convinced that the work that will be required to create, administer and follow such a system is something we want to undertake.

Does the status quo drive people away from Wikipedia/media or just create discomfort? Ileanadu 11:35, 30 August 2011 (UTC)

I think that it is a different thought process, in the sense that it is far less subjective. The rules are formal and verifiable, and the subjects are cleanly identified. The LDS church declares that their temple garments are too sacred to be displayed in public. An object is either a temple garment or it isn't; either the image shows it or it doesn't. With the other types of images that draw complaints, there is always a question about whether a given image is violent enough or sexual enough to really count. There really are no borderline cases in this area.
As for your last question: The WMF has been told by some users that they will not use the projects because of religion-related images. Whether this means that they will not use any of it, or only that they will not read religion-related articles, is more than I know. Perhaps different users take different approaches. I believe that the Arabic Wikipedia has solved the problem of offending their readers by refusing to display any images of the Prophet Muhammad anywhere in ar.wiki, exactly like the German Wikipedia has solved the problem of offending their German readers by refusing to include the names of convicted German murderers, after they complete their prison terms, anywhere on de.wiki.
On the point of sexualized images, many, many schools block all of Wikipedia over the sex-and-porn images. However, I strongly doubt that this filter will change this, since any child can bypass the filter on any image with a single click. A school presumably would not consider that to be an adequate level of control for young children. WhatamIdoing 20:09, 30 August 2011 (UTC)