Talk:2010 Wikimedia Study of Controversial Content/Archive 2
Questions for Discussion -- General Comments (see Main Page for Complete List)
I've kicked off with some brief answers below, and will likely reply to further comments - incidentally, I think it may be a good idea to copy the entire question to this page? - I think it'd be clear enough - and would probably make it easier to follow. If your idea was for answers to be posted to the 'main page' and discussion to occur here, then I can happily shuffle things around - I just wasn't sure if that was the case :-) Privatemusings 06:55, 22 July 2010 (UTC)
- I've taken your suggestion and am posting the complete set of questions here. I'm hoping all of the discussion will take place on this page. Robertmharris 11:30, 22 July 2010 (UTC)
Question 1 -- Wikipedia has put certain policies and procedures in place to deal with special contentious categories of articles (e.g. controversial articles, biographies of living people) (see Wikipedia: Controversial articles http://en.wikipedia.org/wiki/Wikipedia%3AGFCA). Do you think Commons could or should institute similar procedures with some classes of images?
I have no problem with policy emerging for images based on classes (like violence, sexual content, religious imagery etc.) - in fact I think it's probably a good idea. Privatemusings 06:55, 22 July 2010 (UTC)
- Reasonable enough, but we must remember that everyone sees different things as controversial, or would like to stay away from them. For instance, the Flat Earth Society might want to keep away from globes, arachnophobes from spiders and the Amish from buttons. This would essentially boil down to everything being categorised as controversial, which undermines the whole point. -mattbuck (Talk) 11:24, 22 July 2010 (UTC)
- Sounds reasonable, but we need to make sure these are very limited (like those that would cause massive resistance from multiple communities, i.e. extreme sexual content, violent material that has no historical importance, etc.) Sadads 12:34, 22 July 2010 (UTC)
- Reasonable in theory as long as things are narrowly and objectively defined. As soon as you get anything subjective, such as categories like "sexual content", "violence" or "religious imagery", you'll just end up with acrimony and edit wars about what should be in them (is a photograph of two men kissing "sexual content"? What about a labelled line drawing of female genitalia? Should a Tom and Jerry-style cartoon, a video of a professional boxing match and a photograph of the aftermath of a suicide bomber all be included in a "violence" category? Is Scientology a religion? Are photographs of a vicar at a church fête religious images?). Should you tag any image or category as "likely to offend" then you'll very quickly end up with every image/category so tagged, defeating the point. As for Sadads' comment, that sounds like it's okay to cause massive offence as long as it's only to one community - how does that square with NPOV? Thryduulf (en.wikt,en.wp,commons) 13:22, 22 July 2010 (UTC)
"Some classes", such as depictions of living people in the sense of en:WP:BLP, sure. But certainly not special procedures for violence, sexual topics, religion, and the like that seems to be the underlying point of this question. Note that Wikipedia doesn't have special procedures for that sort of issue either. It does have special procedures for biographies of living people, because those people might sue for libel or slander or invasion of privacy if we publish untrue rumors. As far as I know, Wikipedia doesn't actually have any "special" procedures for especially-controversial articles besides the standard policies/guidelines (e.g. neutral point of view and verifiability) and normal dispute resolution; controversial articles just tend to come to the attention of more people who try to enforce the standard policies/guidelines more strictly to keep disruption and edit warring to a minimum. IMO, Commons should generally just enforce their scope, without specific consideration of whether any particular group would be offended. Anomie 16:14, 22 July 2010 (UTC)
- I think it is worth addressing these topics individually, even if in most cases the key statement is that "we don't censor this." In particular, though, there may be a lot of subject matter areas where images ought to get some sort of template tag or a special "hot issue" category. - Jmabel 17:04, 22 July 2010 (UTC)
What is appropriate / controversial / objectionable to someone will not be completely acceptable to someone else. Therefore I think Commons needs two things. The first thing is rules based on objective criteria. Examples are "compliant with law in USA state xxx", copyright, and so on. We largely have them in place, though we might consider adding rules against media which stimulate people to violence or discrimination, such as calls for Jihad. NPOV? I feel some doubt when applying this to images. We should be able to host images of, for example, historic WWII nazi and communist army posters, even though these are obviously not NPOV. The second thing we need I hope to detail further down. TeunSpaans 19:06, 22 July 2010 (UTC)
- Yes. --JN466 01:21, 23 July 2010 (UTC)
- Commons already has. Commons:Commons:Photographs of identifiable people is its BLP equivalent, which is about the only thing en gives special status to. Geni 01:34, 23 July 2010 (UTC)
Obviously not. As long as its purpose is to illustrate encyclopaedically relevant concepts, there should be no censorship. Wikisilki 10:52, 23 July 2010 (UTC)
Nobody is forced to view images on Commons, and no one is even required to use a computer. More explicit images are all over the net. Any censorship is ethically counterproductive in the field of free information.
Apologies for the poor English; this is a Google translation. Regards. --Ensada 16:13, 23 July 2010 (UTC)
- I don't have any problem with this general idea, which I think we already do. But I don't think this idea can help the controversial content debate much, since 'offensiveness' or 'pornographicness' aren't observer-independent. --Alecmconroy 15:55, 24 July 2010 (UTC)
- Please note that the linked item on Wikipedia is an essay, which is simply a personal opinion. It is not a guideline and it is not being enacted. If it were to be enacted, it would encounter serious practical problems. Even so, the controversy of Wikipedia articles can be measured objectively — to some extent — by looking for "edit warring", RfC's, ANI reports, and so on that spill out from them rather consistently. Commons images only become controversial if you create some category to designate them as such, allowing debates over their censorship - so such a process would define what is controversial, not follow it. Wnt 02:28, 27 July 2010 (UTC)
Yes; policies covering general principles and recurring situations can help by reducing the load of case-by-case frictional disputes over images. They also often provide a calmer and more thought-out dialogue, help ensure that future cases match general expectations, and reduce anomalies.
However, note that policies on enwiki are very much creatures of spirit and used with common sense; it's recognized that judgment and exceptions have their place and situations do not always fall into neat categories. The same would apply here - for example whether a file falls into some category or has some impact or value. Overall, and with thoughtful drafting, yes, and I'm surprised Commons doesn't have communally developed guidance on certain areas already given its size. FT2 (Talk | email) 03:10, 27 July 2010 (UTC)
- Yes. Wikimedia is not censored except for really extreme stuff like pedophilia, but for Wikimedia to be able to work with all the projects out there that are trying to get information out to everyone, we need to enable people to put filters appropriate for their project's censorship level. I'm assuming this can be done through either categories or tables of image names; the categories need to be on Commons, but the tables could be inside or outside Wikimedia. If the young earth creationists want to be able to filter out images involving sedimentary rock, cepheid variable stars or the tree of life, then there is nothing to stop them building their own off-project table of images that they don't want used in their own projects. However my preference is for this to be done transparently, and this means on rather than off Commons. It also means that we need to enable complex rather than simple filters - human image settings ranging from xxx to hands and feet with many levels in between. But crucially, responsibility for assessing articles for a filter needs to lie with those who want it to be applied, not the Foundation or the wider community. WereSpielChequers 07:02, 27 July 2010 (UTC)
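As a purely illustrative sketch of the kind of opt-in, multi-level filter described above, the following assumes a hypothetical reader-maintained profile keyed to Commons-style category names; none of these names, levels or functions exist in MediaWiki, and the point is only that the reader (or their project), not the Foundation, owns the profile:

```python
# A minimal sketch (not an existing MediaWiki feature) of an opt-in,
# multi-level filter. All category names, levels and the profile format
# are hypothetical examples.

FILTER_LEVELS = {
    "none": 0,           # show everything (the default)
    "explicit-only": 1,  # hide only the most explicit material
    "strict": 2,         # hide anything the profile marks as sensitive
}

# The profile maps category names to the minimum filter level at which
# they should be hidden.
example_profile = {
    "level": FILTER_LEVELS["explicit-only"],
    "hidden_categories": {
        "Explicit sexual activity": 1,
        "Graphic violence": 1,
        "Nudity": 2,
    },
}

def should_hide(image_categories, profile):
    """Return True if any category of the image is hidden at the
    profile's chosen filter level."""
    level = profile["level"]
    for cat in image_categories:
        threshold = profile["hidden_categories"].get(cat)
        if threshold is not None and threshold <= level:
            return True
    return False

# An image categorised only under "Nudity" is shown at the
# "explicit-only" level but would be hidden at "strict".
print(should_hide(["Nudity"], example_profile))            # False
print(should_hide(["Graphic violence"], example_profile))  # True
```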
- Yes, per FT2. Comet Tuttle 17:39, 30 July 2010 (UTC)
Question 2 -- If yes, how would we define which classes and kinds of images were controversial?
No need to restrict to 'controversial' - I believe it's perfectly achievable to construct rational, useful policy based on classes of images. I do however believe there may be systemic / structural hurdles in doing so through a wiki 'community'. Privatemusings 06:55, 22 July 2010 (UTC)
- Controversial is subjective, and as I noted on another question, if we define anything which could be controversial as "controversial", then we'd end up with the whole archive so defined. -mattbuck (Talk) 11:43, 22 July 2010 (UTC)
- Need a checklist with logical factual statements describing features of objectionable content, and a template where you could choose yes or no. That could be verified by an admin. Sadads 12:36, 22 July 2010 (UTC)
- See also my above comment. Sadads - such would have to be worded very carefully to avoid all cultural bias and interpretation. I also don't understand what you would use the checklist for - e.g. an image has checkmarks for "photograph", "automatic machine guns", "clothed male" and "posed", is this a controversial image? It could apply equally to a child member of a rebel militia proud to be showing off the rifles he used to kill people, and to the Chief Constable of a police force at a press conference following the end of a gun amnesty. How do you distinguish between them objectively? Thryduulf (en.wikt,en.wp,commons) 13:30, 22 July 2010 (UTC)
- That would require extensive research methinks. Sadads 15:45, 22 July 2010 (UTC)
That is exactly the problem with trying to treat controversial content specially: every group has a different definition of what is controversial, and favoring one viewpoint over the rest will result in a large group of dissatisfied people. Anomie 16:16, 22 July 2010 (UTC)
- I would enumerate broadly, but in most cases the policy would be "we don't consider this a special case". - Jmabel 17:05, 22 July 2010 (UTC)
- As Matt says, controversial is subjective. Therefore any classes should be defined by the public / visitors, if we enable visitors to define their own categories of "objectionable". Some might see a pic of Muhammed as objectionable, some might see a topless woman as objectionable, others might see detailed violence as inappropriate. TeunSpaans 19:07, 22 July 2010 (UTC)
- Rather than making up our own rules, we could do a survey to identify what types of media are subject to age restrictions, or illegal, in parts of the world. --JN466 01:25, 23 July 2010 (UTC)
- Err, you don't; none of Wikipedia's special cases for general categories of articles are to do with controversy. Geni 01:35, 23 July 2010 (UTC)
- Can't be done. We have 12 million editors, 375 million readers. An English-speaking minority can never fairly decide what is and is not controversial to that many people spread across the globe. "Controversial" will, in practice, mean "controversial to the people doing the labeling", i.e. controversial to English-speaking admins. The US/UK POV of controversial would, of course, be adhered to, while the Indonesian or Swedish concepts of controversial won't be. --Alecmconroy 16:06, 24 July 2010 (UTC)
- I agree: it's a culture war. You have to decide that some cultures are worth "protecting" and others aren't. Bear in mind that because this would stir up a slow, cumulative hatred against the "protected" cultures, you're not really doing them all that much of a favor. Wnt 02:30, 27 July 2010 (UTC)
Look at the enwiki policy Criteria for speedy deletion. The policy defines a set of the most clearcut cases which all (or almost all) can agree on. It's not long, and from time to time talk page discussion develops or improves it. Words like "blatant, clear, unambiguous" are used to avoid encroachment into legitimate material by people trying to push any envelope.
For this, set up a poll to get ideas and a sense of views of the basic areas which might be involved (sexual imagery? Shock images? Images of poor quality and easy reproducibility? Images of contentious subjects which are already as well or better represented?). These probably fall into two main groups - some media might be "delete first, appeal if desired" and others would be "short (48 hr?) discussion and then delete if they are agreed to fail criteria, appeal if needed". Then a second stage poll to decide specific wording and/or ratification of any with strong support. Something like that maybe. FT2 (Talk | email) 03:16, 27 July 2010 (UTC)
Question 3 -- What policies and procedures might be put in place for those images? How would we treat them differently than other images?
Essentially 'as required' - I support raising the bar on criteria for inclusion, for example to respect personal privacy to a greater degree than the law may require - I feel that we should require the consent of a victim of violence, or participant in sexual content, to publish media in which they feature rather than just a narrow consideration of copyright. Basically I support 'descriptive image tagging' - there are tons of ways of doing this intelligently, and without the sky falling in ;-) Privatemusings 06:55, 22 July 2010 (UTC)
- How does NPOV allow us to treat subjects that we personally like or don't like, but which are equally legal, differently? How do we objectively define which images need the consent of the subject and which don't? You suggest the consent of a victim of violence should be required, which sounds fair enough, but what about when they are dead? What about where it is not certain whether the victim is alive or dead? If the photograph was taken last week it's almost certain they are still alive, if it was taken 100 years ago it's almost certain they're dead, but what about if it was taken 70 years ago? What about recent images where it is not known who the subject or subjects are, e.g. a photograph of people seriously wounded in a market place bombing?
- I think the only policy we should have is "Wikimedia Commons is not censored and does contain images and other media that may offend you." Thryduulf (en.wikt,en.wp,commons) 13:44, 22 July 2010 (UTC)
- Hear, hear! Trying to treat some images differently from others will lead to endless wars over whether the "special" treatment should be applied. We already seem to have this in regard to images in the grey areas of copyright law, no need to drastically increase that. Anomie 16:20, 22 July 2010 (UTC)
- As remarked above, I think the right solution is a template or category indicating the possible issue. This would readily allow someone else to filter on these criteria, either to seek these images or to avoid them. It is also possible that in certain areas we might want to keep images out of category thumbnail galleries except for specially tagged categories, and might want to set a policy of requiring an extra click-through to see certain images in certain contexts. - Jmabel 17:09, 22 July 2010 (UTC)
- And then you have people fighting over whether various images should or should not be tagged with the special template or category. 65.188.240.254 03:10, 23 July 2010 (UTC)
I think it would be worth considering a policy on limiting the use of controversial images on the Main Pages of Wikimedia projects, perhaps at least excluding images that would be considered "porn" (as opposed to say, anatomical illustrations, etc.). Our Main Pages are our public face and should be welcoming to as wide a variety of readers as possible. This isn't about censorship, but about being tactful and respectful to our potential audience (which is extremely diverse). If someone wants to look up fisting pictures, more power to them, but we shouldn't showcase them on our Main Page, IMO. Kaldari 17:51, 22 July 2010 (UTC)
- That's somewhat outside Commons' jurisdiction. Geni 01:37, 23 July 2010 (UTC)
- Commons:Photographs of identifiable people appears to be adequate. Geni 01:37, 23 July 2010 (UTC)
- I think the most we can do, while preserving NPOV, is implement a highly-verifiable labeling scheme (i.e. "descriptive labeling") combined with filter suggestions, so long as independent editors can agree on which label goes with which images. "Controversial" isn't a valid label, but "full frontal nudity" might be. --Alecmconroy 16:14, 24 July 2010 (UTC)
- The only effective way to censor various images is by a third-party site, such as that already mandated under w:CIPA in the United States. Otherwise you're trusting a group of raters with widely varying opinions to do the tagging, and you'll end up with endless judicial processes to punish those who don't use the right ratings. It won't satisfy anyone. Wnt 02:35, 27 July 2010 (UTC)
See my comment above at 2 for policies and procedures. As for treatment, there are technical tools and social tools. Technically, I think an opt-out should be added to the interface, but only as an individual user's selection, with content hidden by a collapse-box rather than actually removed. A preferences tab "sensitive images" would provide check-boxes where the user could select categories of image to be collapse-boxed. This would respect both our principles and viewers' preferences, for Commons viewers and users of other wikis alike. We could also use software tools to detect and flag apparent sexual content (other software professes to be able to do this with varying degrees of accuracy) or images with certain tags. We might also tag the actual images in certain categories so that indiscriminate reusers (spiders), unaware reusers and content-limiting software can pick up our categorization.
In terms of social tools, we have a difficulty - our principles suggest that spidering and free access are positives, and hence I would not intervene on those for any image. But perhaps for some controversial images we should not spider them, or place them in a space where they get seen as "Commons content", until some kind of brief review takes place? (For example a different and NOINDEXED namespace while discussion is ongoing.) The problem there is that this encourages censorship: if every image is queried, the balance will shift towards "pile-on delete" (human nature). Would we have our (legitimate but controversial) sexual images if they had that process? Would it discourage contributions? That sort of thing. Perhaps sexual or suggestive imagery, and shock images? But what if some places want Nazi camp material added to the list? Or disrespectful images of their king? Or line drawings of human anatomy?
So overall I think the technical solutions work, but I'm unsure about the other kinds. Everything will have some objector, and having a modest barrier to pile-on deletion proposals is not a bad thing for free knowledge. FT2 (Talk | email) 03:37, 27 July 2010 (UTC)
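The following is a rough, hypothetical sketch of the collapse-box preference FT2 describes, in which matching images are hidden behind a "Show image" toggle rather than removed; the class names, preference values and rendering hook are invented for illustration and are not part of MediaWiki:

```python
# Illustrative only: wrap an image in a collapsible container when it
# falls into a category the user has ticked in a hypothetical
# "sensitive images" preference; nothing is deleted.

def render_image(img_html, image_categories, collapsed_categories):
    """Return the image wrapped in a collapse box if the user asked for
    it; otherwise return the image markup unchanged."""
    if any(cat in collapsed_categories for cat in image_categories):
        return (
            '<div class="collapsed-media">'
            '<a class="collapsed-media-toggle">Show image</a>'
            f'<span class="collapsed-media-content" hidden>{img_html}</span>'
            '</div>'
        )
    return img_html

# The user ticked two check-boxes in their preferences.
user_prefs = {"Shock images", "Explicit sexual activity"}

print(render_image('<img src="Example.jpg">', ["Shock images"], user_prefs))
print(render_image('<img src="Example.jpg">', ["Lighthouses"], user_prefs))
```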
I am in favor of the principle of least astonishment, as I understand it. I don't think a photograph of a person getting his brains blown out belongs in the "Murder" article, for example; and I do think that "click here to view graphic image" links are acceptable, though I know there's a very vocal libertarian contingent that disagrees with me, throwing up shocked hands at this "censorship". Wikipedia should not be a shock site, for many reasons. Comet Tuttle 17:46, 30 July 2010 (UTC)
Question 4 -- Could uploading practices differ for these images than other Commons images?
Maybe - in truth the uploading experience is rather painful anyways, so I see the issue more as relating to what 'we' decide to do once the media is on wmf servers. Privatemusings 06:55, 22 July 2010 (UTC)
- I don't see how this would work - it would rely on honour and people reading the upload form, which clearly never happens, so people would still need to troll normal uploads for these images, thus removing the whole point of the segregated system. -mattbuck (Talk) 11:22, 22 July 2010 (UTC)
- No. Or at least not until you develop a 100% effective and 100% reliable computer program that can accurately describe all the content of every image prior to upload (I don't foresee this being possible in my lifetime). Otherwise you have to rely on a self-selecting system which requires everybody to read and understand objective and culturally neutral descriptions of what exactly constitutes each category of images (I've never seen any evidence that suggests this is possible). Furthermore it then requires everybody to abide by this every time. Given that well meaning people do make mistakes (click the wrong image to upload for example) and there are a lot of people out there who are not well meaning (the existence of the bad image list is more than enough evidence of this), you will still need humans to view every upload as Mattbuck says. Thryduulf (en.wikt,en.wp,commons) 13:52, 22 July 2010 (UTC)
- Probably not workable. I'd rely more on getting rid of "bad" content quickly, as we do for everything else. - Jmabel 17:10, 22 July 2010 (UTC)
I'm not sure wikimedia should take any extra measures when media are uploaded. Respect for privacy, in some cases, as mentioned by privatemusings, I would support. TeunSpaans 19:20, 22 July 2010 (UTC)
- While we can build custom uploads, people tend to respond to greater complexity by lying. Geni 01:38, 23 July 2010 (UTC)
- In theory, yes. Controversial types of images could be passed through a queue where they are vetted and approved by an OTRS volunteer before they appear on Commons. --JN466 01:51, 23 July 2010 (UTC)
- How would you determine which types of images are "controversial types"? Assuming you've managed to determine this, how are you going to identify which images are of these types without human review? Thryduulf (en.wikt,en.wp,commons) 12:37, 23 July 2010 (UTC)
- "Controversial" doesn't exist except in the eye of the beholder. So we can't discriminate based on that. However, I'm okay with the idea that we can afford to be a little 'extra careful' about issues of sourcing and copyright in any image we think might pose legal problems. But only within reason-- it can't be a blanket excuse to deter content that causes offense. --Alecmconroy 16:19, 24 July 2010 (UTC)
- This would unreasonably burden the Commons upload process, and what would the benefit be? Wnt 02:38, 27 July 2010 (UTC)
- See my comment at 3. FT2 (Talk | email) 03:38, 27 July 2010 (UTC)
Question 5 -- If we assume that sexual images might be one of the categories included, or even if we don’t, do you think we have adequately defined sexual content historically in Wikimedia policies? (see Commons: Sexual content http://commons.wikimedia.org/wiki/Commons:Sexual_content)
Well we haven't really tried, have we? - my feeling is that the linked proposal page, whilst representing some attention on these issues, hasn't really (yet?) delivered any outcomes - again, I believe this may only really be a challenge because we've previously tried to discuss definitions, policies, practices, and philosophies all in one go - that's perhaps part of the wiki discussion weakness? Privatemusings 06:55, 22 July 2010 (UTC)
- Not even close! What the linked page details is a general class of images that a self-selecting group of people believe include a subset of the content that falls into one culture's definition of "sexual content" in some contexts on more occasions than they don't. You'd have to try hard to get woolier than that. Thryduulf (en.wikt,en.wp,commons) 13:57, 22 July 2010 (UTC)
Not really. "Certain types of paraphilias" is quite vague, as is whether a depiction is sadistic or masochistic abuse, as is whether an image "prominently" features the pubic area or genitalia. But as of the current version, the instructions under Prohibited content don't actually care whether something actually is sexual content or not: Illegal is illegal regardless, and both commons:Commons:Photographs of identifiable people and commons:Commons:Project scope apply to all media. Anomie 16:31, 22 July 2010 (UTC)
- Undoubtedly we are not there, but I think we are close. If you think of consensus not as unanimity, but as 90% agreement, lopping off the most extreme 5% on either side, I think we have something like consensus. - Jmabel 17:12, 22 July 2010 (UTC)
Not really. People were rather conservative on the linked page. TeunSpaans 19:27, 22 July 2010 (UTC)
- Is the definition accurate? Sadly no, although it's a laudable effort. But it reflects the inescapable cultural biases of its editors, which are, ultimately, arbitrary. Two individuals kissing ISN'T sexual, unless they appear to be sadomasochists, in which case it IS sexual. Similarly, urination is, to most of us, a completely nonsexual phenomenon. But if someone appears to be sexually enjoying the sight of said urine, suddenly it's a controversial sexual image? How long will it take someone to accuse w:Piss Christ of promoting urine-fetishism? And god only knows how many other weird taboos and fetishes are out there we haven't even considered yet.
- Meanwhile, homosexuality, controversial in most of the world, illegal in the rest, isn't even singled out for special attention, even though it's the most controversial taboo there is. This too reflects (admirably I'd say) the cultural biases of our Wikipedia users, who have concluded homosexuality shouldn't be disparaged on Wikipedia. But if we're really going to start using cultural taboos to decide content, surely NPOV requires us to consider homosexuality as one of the most taboo and thus, most deletable. In short, while I personally might agree with most of these definitions, they're just NOT consistent with a truly globally-neutral POV. --Alecmconroy 16:34, 24 July 2010 (UTC)
- I've tried to improve Commons:Sexual content, but I should point out that it makes no attempt to be culturally unbiased. The definition is based on U.S. legal distinctions. Fortunately, the proposed policy doesn't do very much more than simply state existing Commons policies - most notably, that content illegal in the U.S. can't be kept here - and give users references to existing tools and policies to deal with such content. It seems to me that Commons already has sufficient policies in place to prohibit the things that it was alleged to support (e.g. child pornography, or pornography collections outside of Wikimedia's educational mission), and this proposal documents this. Wnt 02:43, 27 July 2010 (UTC)
Question 6 -- If not, how might the definitions be extended, or otherwise changed?
Ask someone sensible, like yourself, to provide a workable definition, then progress discussions to a point where something is resolved. This requires leadership and / or authority :-) Privatemusings 06:55, 22 July 2010 (UTC)
As far as I can see there are two possible options
- Option A - 1. Abandon the principle of NPOV. 2. Pick somebody at random. 3. Ask them to write their definition. 4. Get them to delete all media that doesn't fit their interpretation of their definition. 5. Get a lawyer to look at every other image and delete any images that in their opinion are illegal for Commons to host. 6. Get a different lawyer and repeat step 5. 7. Repeat steps 5 and 6. Block everyone who doesn't agree with any decision made.
- Option B - 1. Declare that Commons is not censored. 2. Tell people who complain that Commons is not censored, and why. 3. Explain to the newspapers that Commons is not censored and why. 4. Repeat as required. Thryduulf (en.wikt,en.wp,commons) 14:07, 22 July 2010 (UTC)
Is there any actual need for a definition? Or can we just impartially apply the tests of legality, commons:Commons:Photographs of identifiable people, and scope that apply to all media? Anomie 16:33, 22 July 2010 (UTC)
- I'm closer to option B, but I don't see this as inherently trickier than the "notability" criterion we already apply. - Jmabel 17:15, 22 July 2010 (UTC)
- Well, they could be flat out deleted. Otherwise the different projects have varying processes in this area. They do tend to be documented if you really need to know. Geni 01:43, 23 July 2010 (UTC)
- Don't define sexual content, don't define controversial content. Just label the verifiable aspects, and then let individual groups of users do their own flagging of 'controversial images' via third-party filtration. Or, create a new set of projects that don't have to be NPOV/NotCensored, and let them figure out their own standards. Or, don't change anything and watch it continue to succeed. --Alecmconroy 16:46, 24 July 2010 (UTC)
Question 7 -- Commons identifies two goals for itself: supporting other Wikimedia projects, and creating an archive of visual images for use by all. Are these goals compatible with each other? Do they contradict each other?
Dunno - probably not, because the wmf project goals are generally broad and inclusive too - folk will argue that the latter behaves synonymously with the former, and I'm not sure they're wrong (or if it matters) Privatemusings 06:55, 22 July 2010 (UTC)
- They are contradictory - goal 1 says we should host unfree images (such as the wikimedia logo) which contradicts goal 2. -mattbuck (Talk) 11:45, 22 July 2010 (UTC)
- That is a good example. Commons should further relax its extreme free-content rules for hosting images. For example, there should not be a problem with images that are free for educational use or that were released for use on wikipedia. /Pieter Kuiper 11:56, 22 July 2010 (UTC)
- I agree with Pieter here. We have the technical means to tag content according to several copyright statuses. There is no valid reason, apart from political correctness, that we should not use this possibility to include more content. Yann 12:31, 22 July 2010 (UTC)
- I agree with Pieter; the larger the repository, the better it will be used. What communities are we reaching out to right now? GLAM and university communities. Who can use the educational-use stuff? GLAM and university communities. Might be a much better way to get cooperation. Sadads 12:39, 22 July 2010 (UTC)
- I disagree with Pieter - to me, the main goal is a free repository, helping other projects is just a side-effect (god knows they seem to do their best to screw us most of the time). I don't think we should host ANY unfree images, including Wikimedia logos. -mattbuck (Talk) 00:24, 24 July 2010 (UTC)
- agree strongly with matt. Privatemusings 01:27, 24 July 2010 (UTC)
As long as Commons is censored in any way (by content and/or by copyright status) neither goal is possible. Without such censorship I cannot see any conflict. Thryduulf (en.wikt,en.wp,commons) 14:10, 22 July 2010 (UTC)
Isn't the second goal "creating an archive of media freely usable by all"? There is a conflict in that some Wikimedia projects allow media that is not freely usable by all (e.g. fair use images), but by and large they are compatible. Anomie 16:36, 22 July 2010 (UTC)
- I see them as compatible but would rephrase:
- Provide a well-indexed common repository of media freely reusable for other Wikimedia projects.
- Provide a well-indexed repository of media conceivably usable for other educational purposes, or educational in its own right.
- - Jmabel 17:20, 22 July 2010 (UTC)
I see two conflicts and some practical problems:
- Some wikipedias want images that are not free (copyright)
- Commons deletion processes are separate from those of the other wiki-projects. Only after deletion on Commons are references to these images removed from other projects; they are not informed beforehand.
- Commons tends to delete images, even free ones, which they deem non-educational. For other projects, such as schoolbooks, one needs funny pix which do not have any educational value in themselves but which help to create a schoolbook that is fun to read. Such images got deleted in the past, not sure about them now. TeunSpaans 19:46, 22 July 2010 (UTC)
- Conflicts exist but do not fall within the remit of this study. Geni 01:44, 23 July 2010 (UTC)
- Commons is a repository of Free media, just like the Wikipedias are repositories of Free articles. From its very existence, Commons supports the other Wikimedia projects, just like the other Wikimedia projects support Commons (you can see Wikipedia as a project that provides encyclopedic background to the Free media hosted on Commons; it works both ways). There is no contradiction. Commons is a "service project" exactly as much as all the other projects are.
- The fact that Commons has a strict copyright policy is an excellent thing which enables it, rather than hinders it. If Commons started to host unfree images, it would become difficult to manage for its contributors, and hopelessly confusing for its users (nobody suggests that Wikipedia should start "quoting in extenso" works in murky copyright areas to better fulfill its role). Commons is a repository of Free media rather than a mess of "get-away-with-it" media, and it should stay that way if we want it to work optimally. Which we want. Rama 08:10, 23 July 2010 (UTC)
The two are contradictory only insofar as deletion is considered essential to forming a "high quality image archive". If Commons is to continue to exist as a shared host, it's essential that the only things that get deleted are stuff NOBODY is using. If we can't promise that to our chapters, we should just let them host their own images and let Commons be the image-archive-project. --Alecmconroy 16:50, 24 July 2010 (UTC)
- I think that Commons has very clearly drawn back from the idea of simply housing an unlimited set of free images, with Commons:COM:SCOPE. It may be that this is simply too large a mission for Wikimedia — traditionally, an encyclopedia is a map, not the whole territory. I feel that certain aspects of Commons policy, such as "moral" restrictions in photographs of identifiable people, and "courtesy" prohibition of material that has not entered the public domain in its home country, are very serious flaws that need to be remedied, perhaps even greater in scope than some of the proposals for censorship of sexual content. Wnt 02:48, 27 July 2010 (UTC)
- One goal is broader than the other, but they don't contradict each other, provided commons sticks to freely licensed images - fair use or educational use images do not belong on commons but might be acceptable on certain specific projects. WereSpielChequers 07:23, 27 July 2010 (UTC)
Question 8 -- One of the participants in the recent discussion on a Commons sexual policy noted that as Commons more fully fulfills its mission to be an archive of all freely-usable visual images, the question of relevant educational scope becomes increasingly difficult to apply, or at least begins to change. Do you agree? Is this a problem?
Educational scope has not been competently considered by wiki 'communities' (particularly Commons) in my view - it's actually currently synonymous with 'everything' at the mo. - again, a sensible, workable definition would be required if the 'scope' is to be a reality, and not an arbitrary notion. Privatemusings 06:55, 22 July 2010 (UTC) Try and give an example of something which could not be 'educational' if you like :-)
- About the one thing which can be out of scope is user-created artwork, and even then that's not hard and fast. -mattbuck (Talk) 11:42, 22 July 2010 (UTC)
- "All freely-usable visual images" and "Only freely-usable visual images with an educational scope" are clearly mutually incompatible except where "educational scope" is defined as "any image that educates, or could be used to educate, one or more people" at which point the definition is identical to "all freely-usable visual images". As soon as you try to limit the scope of "educational" then you must sacrifice NPOV. Even user-created artwork is educational, for it has a subject and could be used to illustrate an encyclopaedia article, a dictionary entry, a phrasebook, a novel, a textbook, a quotation, an academic study, etc. about that subject. Say for example User:Example (a 17 year old from Corsica) uploads a watercolour painting of a dragon. This could be used in any of the above contexts, it could also be used as part of a studies/nooks/articles/etc about paintings of dragons, dragons in contemporary Corsican culture, watercolour painting in Corsica, watercolour painting in the early 21st Century, etc, etc. It is possible that User:Example could in later life become famous or significant in any number of ways that would make them the subject of an encyclopaedia article and/or academic study, for which an early example of their art might be significant. Thryduulf (en.wikt,en.wp,commons) 14:29, 22 July 2010 (UTC)
- I take a broad view of "educational" and don't see this as a problem. In the area most under discussion, I see human sexuality as being as valid a subject of education as any other. I'm going to again mention an institution I've mentioned before: the Foundation for Sex Positive Culture, which is a 501(c)(3) here in Seattle, and which is one more group you might want to talk to. They have a large library on human sexuality and BDSM, which provides a counterexample to a lot of the more dismissive remarks that have been made here about certain material being inherently "without educational potential". - Jmabel 17:25, 22 July 2010 (UTC)
- Strong yes. To me, the ONLY valid reason to delete otherwise-acceptable information is because of our shared resources. As our resources expand, the scope of what is reasonable to host also expands. In ten years, when bytes are cheaper, we'll be able to host an image of every penis on the planet without it denting our servers. There is no true "non-educational information", it's a contradiction in terms. There is, however, "information that is not sufficiently educational to justify hosting it at this time"-- i.e. "we have enough penis pics right now, thanks". --Alecmconroy 16:58, 24 July 2010 (UTC)
- I very largely agree with Alecmconroy, though I should note that finding the desired image is itself a technical challenge, and this may still be a limitation when the sheer number of bytes is no longer an issue. Wnt 02:52, 27 July 2010 (UTC)
- Yes. If the objective of being an archive of all freely-usable visual images is not dropped, the Commons will end up a dumping ground. Its objective has to be restricted to supporting the Wikipedias. Bdell555 10:32, 28 July 2010 (UTC)
- As Wikimedia accumulates pictures of bears, at some point, truly, no further bear pictures are going to be needed to advance anybody's understanding of bears. This is not Flickr. Comet Tuttle 17:42, 30 July 2010 (UTC)
- Exactly, Comet Tuttle. And this is where we run into the issue of "educational value" versus "all free images". I can see us needing thousands of pictures of bears, and perhaps hundreds of thousands (millions?) of pictures of trees (how many tree species are there, and how many different environmental conditions are there to take pictures of those trees in?). But what happens when we start getting pictures that are essentially duplicates, but aren't? Once we have 10 pictures of an elm tree at dawn covered in a light frost with a pink partially cloudy sky overhead, do we really need more? They're public domain images, but they are devoid of educational content from the perspective of Commons because we already have many such images. I realize that we aren't in this position yet, but we will be eventually. Either we rename Commons "Flickr v1.5" and become a free image hosting site for everything out there regardless of usefulness, or we enforce some kind of educational restriction on image uploads.
- As for pictures of a sexual nature (which is what this is all about), the same idea applies. There are valid reasons for having images of naked people, or even sexual acts. But after you have your 500th picture of "doggystyle, German male, Chinese female", you go from "educational demonstration images" to "porn hosting site". I have no ideological objection to Commons being a porn hosting site, but Commons will never have enough server space to handle every image (or video, heh) ever made. As space increases, so will available content, but with a large multiplier (because as our HDD space increases, so will everyone else's, so they'll be able to create more content, and they'll want to upload more of that content to Commons).
- This means that we have to make decisions as a community about what type of images are most likely to add educational value to Commons. It's not about "that image isn't educational, so it has to go". Every image is educational. What it's really about is "this naked painting by XXXXXX famous artist is more educational than this naked painting by user chucknorris69". Not everything is equally educational. For practical purposes — if for nothing else, then for usability reasons (if the image you want is buried under a mountain of garbage, then it might as well not exist since you can't find it) — we should host the most educationally relevant images first. Gopher65talk 14:38, 2 August 2010 (UTC)
- Of course, we have three photos of doggy style, two of them virtually identical, and over three hundred photos of Ursus arctos, the brown bear. So making this about pictures of a sexual nature is a gross violation of NPOV, because those are not the categories where we have hordes of redundant pictures. --Prosfilaes 22:52, 29 August 2010 (UTC)
Question 9 -- Should the number of current images in any category be a factor in the decision whether to keep or delete any recently-uploaded image?
Nope Privatemusings 06:55, 22 July 2010 (UTC)
- I generally say no, as otherwise no one would let me upload my thousands of images of trains. However, I do feel that this has somewhat of a place where the image being uploaded is of low quality, and Commons has a {{nopenis}} template for a reason - lots of people upload bad quality photos of genitalia. That being said I'm generally in favour of pruning bad quality images of any sort. -mattbuck (Talk) 11:20, 22 July 2010 (UTC)
Use and quality are a more important factor than the number of available images. Sadads 12:43, 22 July 2010 (UTC)
- No. If we have two images of an identical subject (not just "penises" but "close up left-facing profile photographs of clean, disease-free, average-length flaccid penises of post-pubescent young adult white males with untrimmed, dark pubic hair and no visible piercings, tattoos or other decoration or modifications"), and one is of significantly poorer quality than the other, then it should be nominated for deletion (not speedy deleted) unless it is under a freer or qualitatively different license than the better quality one (e.g. if the better quality image is gfdl and the poorer one is public domain or cc-by-sa then neither should be deleted). This should happen whether we have 3 or 3000 other images in the category. Thryduulf (en.wikt,en.wp,commons) 14:41, 22 July 2010 (UTC)
- Where we have many well-made images, we should be more willing to delete poorly-made images. For example (in a presumably uncontroversial area) we have no use for a badly focused snapshot of a person or building where we have good, similar images. I think similar criteria may apply (for example) to human genitalia. If we had three good, similar images of human genitalia, it's hard to imagine the use of an additional, less well-made image. But other first-rate images should continue to be welcome, as should images that differ along lines of race, camera angle, lighting, etc. And I have no problem at all with us enforcing these rules more strictly in sexually-related areas than elsewhere. An indiscriminate set of pictures of the White House is less likely to create any problems than an indiscriminate set of pictures of penises. - Jmabel 17:31, 22 July 2010 (UTC)
- Agree with Jmabel. --JN466 01:21, 23 July 2010 (UTC)
- Maybe for now, but not forever, and probably not now either. Usefulness is a far better metric than images-in-category. In general, though, contested deletions, of small files at least, probably aren't a good use of our time. We also shouldn't impose a 'double standard' on controversial images vs noncontroversial images. --Alecmconroy 17:05, 24 July 2010 (UTC)
- I think we do have to recognize that users only see so many images at once in a search, and if there are very many of a given type, the bad can drive out the good. We should be very cautious about deleting potentially useful resources, but in a few cases it isn't altogether unreasonable. Though no doubt there are some academic studies that could benefit from a collection of thousands of penises from all over the world... Wnt 02:59, 27 July 2010 (UTC)
- Yes. Quantity controls encourage increases in quality. If there is only one image in a particular category, its uniqueness argues for keeping it, all else equal. Bdell555 04:02, 28 July 2010 (UTC)
- I think: We should only use these numerical criteria in conjunction with quality criteria. I would be happy with decisions based on quality alone, or on a combination of quality & quantity (ie. "We already have lots of better photos of X; delete it" or "This isn't great quality but we don't have anything else; let's keep it"); but decisions based on quantity alone would make me sad. Bobrayner 14:44, 28 July 2010 (UTC)
Yes. Each additional image makes it harder to find the highest quality images. The sea of poor quality content on YouTube should illustrate the problem amply. Comet Tuttle 17:48, 30 July 2010 (UTC)
- Yes, it is an influential factor. You can't overload a category with three images, so even the smallest distinction is worth keeping. If you have hundreds of images, then either you have to start deleting some or subcategorizing. --Prosfilaes 22:58, 29 August 2010 (UTC)
Question 10 -- Images on Commons are presented, by and large, without context, often with just a brief descriptor. Should a note on context and reason for inclusion be part of a regime around controversial images (as it is, for example, on the image of the Jyllands-Posten cartoons depicting images of Muhammad, to explain deviation from normal licensing regime)?
Don't really care - leaning towards it being irrelevant if media is responsibly curated. Privatemusings 06:55, 22 July 2010 (UTC)
- That's a little silly; curation is more important than contextual descriptors. Imagine how much time it would take for volunteers to write contextual rationales for every mildly controversial image. Also, new users: image upload is already complicated, do we want to make it more so? Besides, at the bottom of every page there is a list of pages which use the image on Foundation projects; isn't that good enough for context? Sadads 12:42, 22 July 2010 (UTC)
I don't really understand the question, but if you adopt the (imho) only workable policy of "Commons is not censored." then context or lack thereof does not matter (excluding legal issues, but then if an images requires context then AIUI we couldn't legally host it anyway). Thryduulf (en.wikt,en.wp,commons) 14:44, 22 July 2010 (UTC)
The description pages of all images should provide sufficient context for the image itself that someone unfamiliar can figure out what it is supposed to be. I have no answer to the specific question here, as I strongly disagree with the underlying assumption that there should be a "normal licensing regime" such that any image hosted (besides the non-free Wikimedia logos) would be a deviation. Anomie 16:47, 22 July 2010 (UTC)
- No need for anything beyond the usual description, templates, and categorization approaches. I do like the idea of subcategories for sexual (or even nude) images within a larger category, e.g. that images of nude people cooking don't belong directly in the cooking category. - Jmabel 17:33, 22 July 2010 (UTC)
The upload procedure is already too complicated; we should not make it more difficult and time consuming. But I would favour a possibility for visitors to tag images as objectionable with one or more tags (bestiality, islamic, pornographic, etc) and then have users select a profile (in their profile for registered users, in a cookie for non-registered users) for things they don't want to see. That system works reasonably well on YouTube and other sites for sexual content, and I see no reason why it would not work for other types of content. TeunSpaans 19:55, 22 July 2010 (UTC)
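A minimal sketch of the cookie side of the suggestion above follows: an anonymous visitor's chosen "don't show me" tags are remembered in their browser, so no account is needed. The cookie name, tag values and functions are assumptions for illustration, not an existing feature:

```python
# Illustrative only: remember an anonymous visitor's opted-out tags in a
# cookie and check images against them.
import json
from http.cookies import SimpleCookie

def build_filter_cookie(hidden_tags):
    """Serialise the visitor's chosen tags into a Set-Cookie header."""
    cookie = SimpleCookie()
    cookie["hidden_tags"] = json.dumps(sorted(hidden_tags))
    cookie["hidden_tags"]["path"] = "/"
    cookie["hidden_tags"]["max-age"] = 60 * 60 * 24 * 365  # remember for a year
    return cookie.output()

def is_hidden(image_tags, hidden_tags):
    """True if the image carries any tag the visitor opted out of."""
    return bool(set(image_tags) & set(hidden_tags))

print(build_filter_cookie({"pornographic", "graphic violence"}))
print(is_hidden(["islamic imagery"], {"pornographic"}))  # False
```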
- I think context is vital to this issue (and possibly the only thing that we may be able to build consensus around). For example, although it may be perfectly fine to include an image of nude bondage in the BDSM article, you probably wouldn't want it included in the Rope article. Similarly you wouldn't want to feature an image of Osama Bin Laden on the en.wiki Main Page on 9/11. Right now, we rely solely on the discretion of our editors to make sure images are used in proper context. It would be useful if we actually had a policy we could point to if an editor happened to have a catastrophic loss of discretion. Kaldari 22:31, 22 July 2010 (UTC)
- I am afraid that the example is inadequate: the Jyllands-Posten cartoons do not seem to be hosted on Commons, and as far as I know, they are unlikely to be in the short term since they are protected by copyright. There might be some confusion between controversies that occur on Wikipedia and are widely reported in the mainstream media, and the actual problems that we have on Commons.
- I think that the "by and large, without context, often with just a brief descriptor" should be qualified more precisely -- or rather, that the implicit adequate media description be qualified. For instance, what is lacking with the description of this controversial image of Mohammed Mohameddemons2? Note that in practice, descriptions often contain links to Wikipedia, where the subject or author is discussed at length. Rama 08:30, 23 July 2010 (UTC)
- No descriptions or context necessary whatsoever, nor should it be. Too much bureaucracy discourages contributions. (That said, if anyone wants to go through and write up contexts, why not?). --Alecmconroy 17:10, 24 July 2010 (UTC)
- I think we should strongly encourage descriptions; and if there are a large number of images of the exact same thing, then we can prefer those with better descriptions over worse; or we can use detailed descriptions to say that they aren't the same thing. In other words, a hundred pictures of (generic) penises may be much more than we need; but a picture of one penis from each nation in the world might document more seriously educational variations of human morphology. Likewise, random images of Muhammad copied off the Facebook site may have no obvious educational point to them, but when documented to an artist or a source they become much more desirable. So as a rule, the more information available, the easier it is to justify things under an educational mission. But this should never be necessary with the first image of a given type. Wnt 03:04, 27 July 2010 (UTC)
- Yes. If the inclusion or exclusion of media is likely to be challenged, a rationale should be provided, just as rationales are expected for Wikipedia edits when the reason for the edit is not obvious. Bdell555 04:19, 28 July 2010 (UTC)
Strange set of questions
This is a strange set of questions. I wonder how much the people you talked with know about Commons. For example, the last question refers to the cartoons published by Jyllandsposten, but en:File:Jyllands-Posten-pg3-article-in-Sept-30-2005-edition-of-KulturWeekend-entitled-Muhammeds-ansigt.png is not on Commons, because it is not a free image. The note is a "non-free use rationale", which enwp requires also for other newspaper pages, for non-free science visualizations or for portraits.
And yes, the copyright restrictions on Commons make it difficult to cater to the needs of the wikipedias. It should be no problem to include publicity images that are free for editorial use in wikipedia articles. Yet Commons regards anything as not sufficiently free when images do not explicitly allow derivatives or commercial use. Is your question 7 really a part of your mandate? /Pieter Kuiper 10:59, 22 July 2010 (UTC)
Puzzled
I have followed the public discussions on this subject, starting from the unfounded allegations around what is on Wikimedia Commons and ending here, and a few things puzzle me:
1) Why is the image more threatening than the word? So far as I can see, not many object to a set of words that describe an activity or thing that could be potentially obscene, but a lot more people are concerned about the image that is associated with those words. I'm just curious as to how - or why - there is a separation between these two things, and how rating systems typically deal with the problem.
2) I've tried to understand how internet material filters (or labelling systems) such as the one Craigslist uses (http://www.w3.org/PICS/) connect to censorship. Are rating systems inherently incompatible with free speech (i.e. are there none considered compatible with strong anti-censorship principles)? What is the view on Craigslist and PICS for instance?
Just trying to understand. --Aprabhala 13:55, 22 July 2010 (UTC)
- In response to your question 2, filters and censors (whether automatic or human) must make a selection of which images to show and which to block - this is their entire raison d'être. The selection process is inextricably linked to personal opinions, cultural norms and other societal factors of what is and isn't suitable for someone else to see, and a value judgement must be made about each image to determine where it fits. For example, if you wanted to block images of, say, "nudity", you or someone else must determine what constitutes nudity, and people from different eras, cultures and religions - and even people within the same time, religion and culture - have different opinions about what constitutes nudity. For example, I doubt whether we could get 100% agreement, even just among people viewing this page, about which of the following images contain nudity:
If the filter decides that all the images should be filtered, but you believe that at least some should not be, then you would complain the images have been unnecessarily censored. Thryduulf (en.wikt,en.wp,commons) 15:23, 22 July 2010 (UTC)
- Well, this is part of the reason why ICRA/PICS uses more descriptive labelling than just 'nudity', of course. See their vocabulary, which in addition to that includes a 'context label'. TheDJ 16:39, 22 July 2010 (UTC)
- TheDJ - I've had a look and those descriptions are so vague and culturally dependent as to be laughably useless! For example the last image above would have to be tagged as "no nudity" (her breasts are not exposed, her buttocks are not bare and her genitals are not visible) and yet that's one of the images in that group most likely to offend! 17:14, 22 July 2010 (UTC)
- Thryduulf, this is very useful. But, I imagine that for a place like Craigslist, the metadata is generated (or at least monitored) by site admin and not users. On Wikimedia sites, if the metadata is generated by the users (i.e. discussed in the same manner as everything else), at least for that set of users, wouldn't that create some 'answers' as to which of the images above are to be rated - surely not perfectly, but just as well as anything else that is discussed on Wikipedia? I also don't fully agree that different interpretations of nudity are a problem that cannot be handled; at least in as much as different interpretations of Israel and Palestine are not a problem on Wikipedia. This kind of thing happens all the time and there's a working balance that's achieved. In some cases, like with 'images' of the prophet Muhammad, different language Wikipedias seem to have their own policies (debated like crazy, but they do). To get back to the original questions though, I don't understand why the image is more offensive than the text, and I wonder if the problem of arriving at ratings is not something that's already being practised on every single page across Wikimedia - to the extent that you can substitute ratings for quality, notability, neutrality, etc.--Aprabhala 17:04, 22 July 2010 (UTC)
- While you can word text to make it clear that some people believe X about Israel/Palestine but other believe Y, you can't do that with an image - it either is something or it isn't something, a map either shows Palestine existing or it doesn't. Thryduulf (en.wikt,en.wp,commons) 17:14, 22 July 2010 (UTC)
- As a follow-up, and on reading David Gerard's postings on foundation-l, is there a fundamental problem with aiding filtering/monitoring software to do its job well? What I mean is this: existing images on Wikimedia have tags on them. These tags may or may not be helpful to monitoring software (which I don't know much about, but NetNanny is what one hears). If new user-generated metadata is created around those images merely to enable monitoring software to do its job (and leave the experience of those not using it unaffected), then do the cultural nuances matter as much? Somehow I think that the user who is offended by the sex & violence on Wikimedia won't mind if their filtering software errs on the side of caution (i.e. perhaps classifying partially and fully exposed torsos, male and female, as one thing, etc.) And for everyone else (like me, e.g., who thinks that most of the articles related to matters sexual are usually well written and evenly phrased), well, there's no need to use filtering software, no? So I get the point about how ratings as they exist now will always result in some kind of goof-up (aka censorship), but if the people using the ratings are (not us but) those who use NetNanny anyway, then: is it censoring Wikimedia to allow a filtering/monitoring programme to do its thing on Wikimedia? [And... this is the problem with not knowing much... is the problem at hand now that the filtering software doesn't work on Wikimedia because of the way we classify things?] --Aprabhala 19:26, 22 July 2010 (UTC)
- If we had filtering settings I would set my home PC to allow all of the above set, but if I were using Commons imagery at work I would like to be able to set my tolerance settings at "beachwear"; however, if Islamic users wanted to set their tolerance levels to filter out anything other than hands, I have no objection to having the categories that would enable this. WereSpielChequers 07:13, 27 July 2010 (UTC)
Per-user content filtering
editI know that we are currently still in an information-gathering stage, but after reading the questions, the mailing list and the responses so far (along with some divine inspiration of my own) I came up with a crude solution which may or may not be a good idea (just one way to know - writing it down!).
The problem
editBefore I get into the actual idea, I compiled a short overview of the conclusions I based the idea on. As they say - if the schematics are wrong, the result will inevitably be wrong as well. (And note that, as an en.wiki Wikipedian, my entire basis will be built upon experience with that wiki.) But on to the actual text.
The problem is that we have content that may, for various reasons, be offensive to single people or entire groups of people. Such offensive content is diverse and covers the entire spectrum from nudity to propaganda, and from religion to personal feelings. The difficulty is that there is no set pattern - every user is an individual with his own biases and opinions. As a result we cannot set a top-level filter without hurting some users, and equally we cannot allow certain content without insulting others. Either way, someone isn't happy.
The second issue is that there is no middle ground. We either have the images in an article, or we delete the images from an article. If they are included, everyone can see them, but this includes people who do not wish to see them. If we delete them, no one can see them, but that would equally include users who are not offended by the content. Thus: we cannot exclude certain categories without raising a censorship or bias issue - and even if we did exclude images we would still be dealing with the fact that offensiveness is personal. Who could determine what others can or cannot see? And who could do it in such a way that we won't end up with a bias towards the wiki's geographical location? I would say that no one can be granted such control on a wiki - not admins, not stewards, not users. None.
The idea - individual content filtering
editIn my eyes the only way to deal with this issue is through giving every individual user a choice as to what they want to see. By giving each user a choice of their own we could likely create some middle ground where most people are happy. My idea is a content filtering system that each user may adapt to his or her own wishes. To do that we add a tag-tree-like structure that details the content of the images, and give users an interface which allows them to block trees, sub-trees or nodes. For example, we might have a top-level tree called "Nudity", with two sub-trees called "Sexual nudity" and "Nonsexual nudity", each having a node (for example "Sexual intercourse" in sexual and "Topless" in nonsexual). If a user selected to block "Nudity", this would block that tree and every subtree and node. If they selected to block "Sexual nudity" it would block that subtree and the "Sexual intercourse" node, but leave images containing nonsexual nudity alone. Of course selecting "Sexual intercourse" alone would block only images containing that content.
One positive aspect of this structure is that the current categories on Commons might be a great starting point, or perhaps even sufficient, to implement such a system. In that case we only need a front-end to surround an already-present system, and some modifications to the code that loads images (saving a great deal of time over starting anew).
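A minimal sketch of how the tree-and-node blocking described above might work. The tag names simply mirror the "Nudity" example; the data structures and functions are hypothetical illustrations, not an existing MediaWiki feature:

<syntaxhighlight lang="typescript">
// Hypothetical tag tree mirroring the "Nudity" example above.
interface TagNode {
  name: string;
  children: TagNode[];
}

const nudity: TagNode = {
  name: "Nudity",
  children: [
    { name: "Sexual nudity", children: [{ name: "Sexual intercourse", children: [] }] },
    { name: "Nonsexual nudity", children: [{ name: "Topless", children: [] }] },
  ],
};

// Blocking a node blocks that node and every descendant,
// so blocking "Nudity" also blocks "Topless".
function expandBlock(node: TagNode): Set<string> {
  const blocked = new Set<string>([node.name]);
  for (const child of node.children) {
    for (const name of expandBlock(child)) blocked.add(name);
  }
  return blocked;
}

// An image is hidden if any of its tags is in the user's expanded block set.
function shouldHide(imageTags: string[], blocked: Set<string>): boolean {
  return imageTags.some((tag) => blocked.has(tag));
}

const blockedByUser = expandBlock(nudity); // the user ticked the whole "Nudity" tree
console.log(shouldHide(["Topless", "Beach"], blockedByUser)); // true
console.log(shouldHide(["Beach"], blockedByUser));            // false
</syntaxhighlight>

Selecting only the "Sexual nudity" subtree would be the same call with that node instead, leaving "Topless" images visible.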
Advantages
edit- The fine control would allow per-user settings, thus giving direct control to each user as to what he or she wants to see.
- Similar systems already exist, so we won't have to invent the entire wheel - we just tweak it to our needs.
- The tree-like structure allows users to quickly block entire content areas, with the subtrees and nodes allowing more precise control if desired (thus speeding up configuration).
- Settings can be saved for logged-in accounts, which would make this a one-time action
- Blocking certain unwanted images means less bandwidth, and less bandwidth equals lower costs :P.
Disadvantages
edit- While per-account setting saving is possible, per-IP saving would be hard due to dynamic IPs. At most we could use a persistent cookie or something similar, but even then it would be limited to a computer - and not a user.
- Users must know about the system in order to use it. Not everyone is computer-savvy so the system has to be fool-proof, which is difficult.
- A lot of editors land on an article page and not on the front page (for example, through a Google search), which means that unwanted images may be displayed before an editor can change the settings.
Challenges
edit- Would such a system allow unwanted censorship, where users can force settings upon others?
- Will users understand the system, and will it actually be used?
- Is Commons' current category system usable for such a system? If not, can it be easily adapted? (Or would we have a backlog of 7 million pictures to tag?)
- Would Commons users and admins spend their time "stub-sorting" images for the system?
- Could this system be easily integrated with local uploads (only free media is on Commons - non-free media is local on every wiki)?
I made a different proposal ([1]) because I think that the Achilles' heel to your approach is that it relies on random Wikipedia contributors to agree on a set of content tags. But who decides who gets to apply a given class of tags, or enforces sanctions when someone misapplies them? The arguments all land in the hands of a small number of admins with a lot of other things to do. This is not reasonable, especially when some religious and cultural viewpoints will simply have so many internal contradictions and inconsistencies that it really isn't possible for them to get together and rate content without them falling apart in discord. They have to have their own servers, their own power structure, their own rules and regulations, all under their own control, in order to do their job (or fail to do their job) as they want it done. Wnt 03:13, 27 July 2010 (UTC)
Comments
editGood idea? Bad idea? Fantastic idea? Ridiculous idea? Easy to implement? Impossible to implement? Usable, Unusable? Whatever your opinion, please voice it. Excirial (Contact me,Contribs) 21:36, 22 July 2010 (UTC)
- all very sensible :-) - and a bit similar in flavour to ideas previously raised. Is this the sort of proposal you (Robert) find useful to raise and discuss? Privatemusings 00:12, 23 July 2010 (UTC)
- Sounds interesting. You could offer users a front end with a few preconfigured selections to make it easier for them (no Mohammed, suitable for under-12, etc.). --JN466 02:05, 23 July 2010 (UTC)
I think it's a good idea, but I'm certain creating a parallel category structure for objectionable content will have vast unintended consequences with the usual mass media suspects. I predict headlines such as, "Wikipedia Organizes Porn in Easy-to-Find Categories," and "All Pictures of the Prophet Muhammed Available on Wikipedia with Just One Click," etc. If we really want to do this the right way it should probably involve an entirely new database table rather than the existing category schema, and it should not depend on extra Javascript and CSS that will slow down page downloads and only work on some browser configurations.
Also I'm concerned that the stated goal of "doing no harm" has morphed into what is evidently nothing more than a public relations CYA exercise, and I guess the discussion questions clearly imply that "Controversial Content" actually means "Controversial Images," with no regard to text at all, don't they? 71.198.176.22 03:28, 23 July 2010 (UTC)
- Any change will inevitably invite negative feedback, as anything can be bent to suit one's needs if enough pressure is applied. A kitten rescued from a drain pipe can be described as "Wasted money: How a single kitten cost the taxpayer over 150.000 in rescue efforts". We can also be sure that some people will still complain that we show certain content, on the basis that they feel no one should see it, while the other side can complain that we included the means for censorship. In other words, it is nigh impossible to please the edges of each side, nor should we try.
- As for the "images only" focus, it is understandable, as the impact of images is much larger. They are either present, or they are not present, with no in-between where users can choose. Text, on the other hand, is easily adaptable, and thus it is easier to create an alternative that is acceptable (neutral) for most people. Excirial (Contact me,Contribs) 08:11, 23 July 2010 (UTC)
In theory this is a good idea. However, for it to be anything other than pointless by filtering too much (in which case people won't use it) or too little (in which case people won't use it), it has to offer very fine control over what people want to see. Even "naked male buttocks" isn't fine enough control: people may be fine with it in an ethnographic context but not in the context of a streaker at a football match (both equally non-sexual), and it doesn't define whether buttocks covered in body paint are naked or not. Similarly, at what stage of development do girls' breasts have to be before they are classed as topless (given that toddlers generally aren't so classed, even when they have no tops on), and in some cultures it is based on age rather than state of breast development. Does a woman wearing a bikini top that covers her nipples but not the majority of her breasts have "exposed breasts"? In Saudi Arabia she probably would, but in America she probably wouldn't. Then you have to ensure that every image is tagged for all content (e.g. some people will not want to see spiders, so you need to make sure that every image that contains spiders, even in the background, is so tagged). You can't rely solely on categories either; for example, there is part of the hierarchy that goes Sex - Chastity - Virgin Mary [...] Churches in the Czech Republic dedicated to the Virgin Mary. So if you block all images in the "Sex" category and its subcategories, you also block the pictures of churches. If you also whitelist the category "Churches" then you create a conflict, as "Churches in the Czech Republic dedicated to the Virgin Mary" is both blocked and allowed, and it obviously cannot be both.
To summarise, descriptive image tagging is a Good Idea and should be encouraged, but until every image is so tagged, using it for filtering is just not practical. To make a filtering system work for our purposes it needs to be as finely controllable as possible, but this unavoidably means it is complicated and time-consuming to set up, which will put off the people most likely to want to use it. Thryduulf (en.wikt,en.wp,commons) 12:59, 23 July 2010 (UTC)
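For what it's worth, the block/whitelist conflict in the example above would not have to be fatal if the filter committed to a precedence rule. The sketch below is purely illustrative: the category names come from the example, Commons categories are treated as a graph with multiple parents, and the "nearest rule wins, allow beats block on a tie" policy is an assumption, not anything Commons actually does:

<syntaxhighlight lang="typescript">
type Rule = "block" | "allow";

// Hypothetical fragment of the category graph from the example above.
const parents: Record<string, string[]> = {
  "Churches in the Czech Republic dedicated to the Virgin Mary": ["Virgin Mary", "Churches"],
  "Virgin Mary": ["Chastity"],
  "Chastity": ["Sex"],
  "Churches": [],
  "Sex": [],
};

// The user's explicit choices: block the "Sex" tree, but allow churches.
const userRules: Record<string, Rule> = { "Sex": "block", "Churches": "allow" };

// Walk upward through the category graph, breadth first; the first level at
// which any explicit rule is found decides, and "allow" beats "block" on a tie.
function resolve(category: string): Rule | undefined {
  let frontier: string[] = [category];
  const seen = new Set<string>();
  while (frontier.length > 0) {
    // Collect any explicit rules at this distance from the starting category.
    const rulesHere: Rule[] = [];
    for (const c of frontier) {
      const rule: Rule | undefined = userRules[c];
      if (rule) rulesHere.push(rule);
    }
    if (rulesHere.length > 0) return rulesHere.includes("allow") ? "allow" : "block";
    // Otherwise move one level up through all parent categories.
    const next: string[] = [];
    for (const c of frontier) {
      seen.add(c);
      for (const p of parents[c] ?? []) if (!seen.has(p)) next.push(p);
    }
    frontier = next;
  }
  return undefined; // no rule applies anywhere above; show the image
}

console.log(resolve("Churches in the Czech Republic dedicated to the Virgin Mary")); // "allow"
</syntaxhighlight>

Whether "allow" should really beat "block" is, of course, exactly the kind of value judgement being pointed at here, so a rule like this only moves the argument; it does not settle it.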
Counter-proposal
editI just thought of another idea that might be done which wouldn't impose so much overhead, and plausibly falls into the acceptable range of user customization settings. Rather than tagging images at all, one supposes that a user might choose to run a custom skin which can set up a blacklist of unwanted images. Images on the blacklist would be collapsed by default. The user could choose to expand and view one of these as desired, and could blacklist or un-blacklist any image with some tabbed feature. Ideally the software would give the option to export the blacklist to a user subpage, and import blacklists from other users' subpages and merge them; if the organization has a certain hierarchy it might thus accumulate very large lists (if user client software processing speed allows it, that is). In this way a group of Christian or Muslim or weak-stomached users might collect lists of unwanted images, which they could use to avoid personal distaste. The feature would not and should not have any major use in censorship against other people's reading, except in that the blacklists generated by a group might eventually be used by some third-party censorware to help build its lists of images to block. Wnt 02:03, 28 July 2010 (UTC)
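A very rough browser-side sketch of that blacklist idea, assuming it ran as a user script on rendered pages. The storage key and file names are made up, and exporting the list to a user subpage or merging other users' lists would need additional MediaWiki plumbing not shown here:

<syntaxhighlight lang="typescript">
// Hypothetical user-script sketch: collapse blacklisted images behind a
// click-to-show placeholder, with the blacklist kept in localStorage.
const STORAGE_KEY = "imageBlacklist"; // made-up key

function loadBlacklist(): Set<string> {
  return new Set<string>(JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]"));
}

function saveBlacklist(list: Set<string>): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify([...list]));
}

// A "blacklist this image" tab or button on each image could call this.
function addToBlacklist(fileName: string): void {
  const list = loadBlacklist();
  list.add(fileName);
  saveBlacklist(list);
}

function applyBlacklist(): void {
  const blacklist = loadBlacklist();
  document.querySelectorAll("img").forEach((img) => {
    const fileName = decodeURIComponent(img.src.split("/").pop() ?? "");
    if (!blacklist.has(fileName)) return;

    // Hide the image and offer a per-image "show" button instead.
    img.style.display = "none";
    const placeholder = document.createElement("button");
    placeholder.textContent = "Hidden image (" + fileName + ") - click to show";
    placeholder.addEventListener("click", () => {
      img.style.display = "";
      placeholder.remove();
    });
    img.insertAdjacentElement("beforebegin", placeholder);
  });
}

applyBlacklist();
</syntaxhighlight>

Merging someone else's exported list would then just be a set union, which is what makes the "import blacklists from other users' subpages" part of the proposal cheap in principle.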
Controversial content inside Wikipedia
editI'm not sure I understand why you will focus on images, or maybe I should read "controversial" as "problematic for Western mainstream media"?
If you want a first look at what the controversial content inside Wikipedia is, you should have a look at these scientific studies on the subject: What’s in Wikipedia?: Mapping Topics and Conflict using Socially Annotated Category Structure & He Says, She Says: Conflict and Coordination in Wikipedia
Thus, watch out for content about religion, philosophy, people, science, society and social sciences, and history.
References:
- Aniket Kittur, Ed H. Chi, and Bongwon Suh, What’s in Wikipedia?: Mapping Topics and Conflict using Socially Annotated Category Structure. In Proceedings of the 27th international Conference on Human Factors in Computing Systems (Boston, MA, USA, April 04 – 09, 2009). CHI ‘09. ACM, New York, NY.
- Aniket Kittur, Bongwon Suh, Bryan A. Pendleton, Ed H. Chi, He Says, She Says: Conflict and Coordination in Wikipedia. In Proceedings of the 25th international Conference on Human Factors in Computing Systems (San Jose, CA, USA, April 28 – May 3, 2007), CHI’07. ACM, New York, NY.
Good luck. nojhan 13:31, 23 July 2010 (UTC)
Ways to solve this while preserving NPOV/NOTCENSORED
edit- Do nothing, but leave it up to the regular internet filter people. There's a whole world of people trying to protect other people from potentially offensive content. We don't need to re-invent the wheel, we don't need to build a wheel, we don't even need to provide a wheel. All we need to do is say "Wheels exist, if you want one, go google it".
- Allow voluntary verifiable/Descriptive labeling combined with a Wikimedia-provided voluntary filter.
- Let like-minded individuals collaborate to produce different subjective ratings-- but allow for a diversity of such ratings, none of them 'official'.
- Start allowing a greater diversity among the projects. The reason Wikipedia CAN'T be censored is that it's the only Wikipedia we have. But nothing is stopping us from making new projects that have their own policies. I think the future of the Wikimedia movement will see us hosting THOUSANDS of such projects, each with their own unique topics, readerships, and editorial policies. Muslim readers, for example, have expressed displeasure that they have no "project of their own" right now, not even in their own languages -- well, why not let them try to build one? The beginnings of a kid-safe project? Why not try it? And then, when there is diversity of projects, any GIVEN project becomes less mission-critical.
- Lastly, if somehow we absolutely had to have a censored project, then they could always rename the current Wikipedia to "Uncensored Wikipedia" or something, to underscore this fundamental aspect of its policy. The new non-offensive Wikipedia can argue over offensiveness, while the rest of us continue editing unfettered by such concerns. (Though I feel "Uncensored Wikipedia" should have the claim to the Wikipedia name, but you get the idea). --Alecmconroy 17:35, 24 July 2010 (UTC)
- Well, a lot of ways to tackle this issue, so let's see...
- 1) Quite a valid option. Internet filters already exist, so it should be possible to modify them in a way that blocks certain images. One concern is that certain images may not be blocked by default, which may lead to lots of manual configuration (and some users aren't exactly computer-savvy). Another problem is that we constantly update Wikipedia, which may lead to unblocked content every now and then. Still, it's a fine idea.
- 2) Well, I can be brief about this one - see my own section up above which deals with this issue.
- 3) I presume such ratings would be used in combination with a normal filter? Even so I would point out that subjectivity is generally a bad thing - it makes collaboration between editors difficult, which matters since we would have 7 million images to rate. The amount of effort and the cost of maintenance would likely defeat the purpose. I would be surprised if we had more than one half-finished rating some years from now. If we used an objective measure, people could at least cooperate on such a venture.
- 4) Bad, BAD idea. We already have an example of this (Conservapedia), and we all (well, I can't really speak for anyone but myself, but you get the point) know how neutral and encyclopedic that project is. People are free to fork Wikipedia for their own initiatives, but let them do so without Wikimedia support. One of the strengths of Wikipedia is that we have editors from all around the globe working on the same content, which enforces neutrality due to constant pushes from both sides. If we start splitting on the basis of religious, cultural or political beliefs we would get a fine mess without neutrality. Besides, what's the point? We would get a ton of encyclopedias detailing the exact same content in the exact same language. And who would decide what's worthy of a wiki?
- 5) Bad idea as well. Encyclopedias are valuable because they provide reliable, sourced and neutral information. NPOV, and in part NOTCENSORED, are therefore a critical aspect of any encyclopedia. We would just end up with 1001 projects detailing the exact same thing, with slight-to-major differences in bias. We are here to build an encyclopedia, and not to create 1001 projects where a handful of people can spout their ideas as if they were blogs. No one would be interested in reading a ton of small projects, and no one would be writing other than a handful of people. And people would still be complaining about the other wikis. Excirial (Contact me,Contribs) 09:04, 25 July 2010 (UTC)
comment on 1
editWell, we're already NOT doing nothing - this whole conversation is the wiki doing SOMETHING :) But if I could extrapolate this point, we could do nothing to stop hosting material and yet still be active ... As you say, there must be many, many people out there trying to overcome this issue - after all, these aren't the only sites that have to address it. We could seek out a partnership, seek out all the filter software makers and work with them. I haven't done any research yet, I'm afraid, but there must be some ideas out there; we could at least work with and conform to any schemes in use or in development. If there is a decent open-source one, all the better; it would be much more efficient to put any effort into a content filtering system that could be used elsewhere. We might even be able to incorporate such software into the MediaWiki software to provide the per-user filtering directly. 20:00, 29 July 2010 (UTC)
Medical imagery
editI am not really into censorship, but please consider medical imagery as well, if you are going to censor anything on Wikipedia. I have seen quite a few people complaining, in cases when such images were featured on the main page, that they made them throw up. And while I myself don't feel all that disgusted upon viewing them, I still prefer to turn off images in my browser when I browse articles on medicine, read the captions, and then choose if I need to see them. I figure it wouldn't be that hard to implement filters by each user's preferences. Consider that there already are ways to hide stuff on Wikipedia (such as collapsible templates). I am not an expert, but perhaps all that is needed is a new CSS class with options to label stuff by particular group; e.g. I don't want to see medical images by default, so when I open an article I would see a description, and if I wish I click to see the image. There could also be another option, to the benefit of worried parents, to block the ability to view inappropriate images at all (perhaps certain IPs could also be blacklisted, so schools and such wouldn't block access to Wikipedia). No matter how much such images could add to understanding the subject, the primary goal of an encyclopedia should be to give people the information they need, not to make them throw up because of images they never would have wanted to see; that's what shock sites are for. 81.198.50.44 15:09, 27 July 2010 (UTC)
- Reality is shocking. If you don't want to be shocked, stick to fairy tales. WAS 4.250 20:23, 27 July 2010 (UTC)
- Actually, fairy tales are pretty shocking too - perhaps daytime tv? ;-) Privatemusings 21:17, 29 July 2010 (UTC)
- Wikipedia is not, and should not, be a shock site. "Click here to view graphic image" links are fine and it's not censorship. Comet Tuttle 17:51, 30 July 2010 (UTC)
- As I described at #Counter-proposal above, I think that a setting to allow people to hide certain images might be acceptable, provided that they accept responsibility for managing their own blacklists as individuals and don't try to define moral categories on Wikipedia or pull one another into block proceedings for miscategorizing images. But hiding images from everyone creates problems for such groups as those with carpal tunnel who hate having to make extra clicks, people with browsers with scripts turned off, or people who quickly print out the Wikipedia article to read later. If the image is relevant to the disease, it should be in the article. Wnt 00:15, 4 August 2010 (UTC)
- Recently I happened to watch a portion of Jaws, and was surprised to recall a very peculiar but inspired scene, in which the mother is simply reading a book about sharks, but through the use of music and cutting back and forth to her children playing on a boat it is made dramatic. The scene includes her looking at several pictures of the aftermath of shark attacks, apparently genuine though I can't swear to it, including one of a man who had had the entire back of his thigh torn off. Now Jaws was rated PG - in other words, by a formal mandatory "voluntary" national rating scheme, it was deemed suitable for children of all ages, not just over 17 or over 13, provided that "parental guidance" (not presence) is given. So I think we should bear in mind that there is certainly no longstanding precedent for the idea of requiring gruesome images to be removed from children's sight. Wnt 16:00, 8 August 2010 (UTC)
root of the problem is when the project became a vehicle for anti-censorship activism
editDuring a discussion on Wikipedia I noted that linking to Wikileaks is an offence in Australia, punishable by fines of up to A$11000 per day and that the argument for the project pursuing a policy of maximal global compliance with legal norms (which is generally our policy when it comes to copyright) is just as presumptively valid as the argument for minimal compliance. A user responded "censorship is bad, m'kay? We should comply with censorship laws to the minimum extent we can get away with" and when I pointed out that that's "a non-neutral value judgment", the reply was "you're damn right it's a non-neutral value judgment."
This, in a nutshell, is how the project has ended up, or is heading, offside with too many stakeholders in society at large. Wikipedia took a stand: WP:NOTCENSORED. This policy has made it more difficult to resist the degradation of Wikipedia into an Encyclopedia Dramatica. The declaration that "Wikipedia cannot guarantee [that it] will adhere to general social or religious norms" is, in the hands of interpreting editors, itself a normative statement. It is a decision to minimally comply with social norms, to actively resist and blaze its own trail instead of playing the role of passive follower of the mainstream.
The solution, in my view, is to stop crusading on the matter. That doesn't mean indulging partisans on the other side of the issue. It means spinning out the decision point. This is essentially what Wikipedia tries to do by adopting external reference points for things like whether something is notable. The decisions of other media matter in that they create the material that allows Wikipedia to go with preferred secondary sources instead of primary. Consistent with that philosophy is following the lead of the secondary sources with respect to how to treat the primary, instead of taking a position internal to the project. If an image has been carried by a "mainstream media" outlet, that should constitute a good argument for collecting images of that sort. If a certain type of image is overwhelmingly not found in either mass-market or "high brow" publishers, why should Wikipedia (which surely aspires to be more "high brow" than low) buck this external consensus?
Wikipedia's reliable source policy tries to prioritize "authoritative" sources, by, to take an example, saying that a claim made by a source which is subject to editorial control is more reliable, all else equal. Wikileaks seems to have figured out the value of this by working with media outlets that have respected editors, like the New York Times, instead of just dumping its Afghanistan war logs for the "Internet" to sort out. As it stands now, Wikileaks is holding back some of the material it has, despite its philosophy of maximal disclosure. Why this self-censorship? Because if it dumped the whole lot when the overwhelming majority of mainstream outlets would have exercised more discretion, it would marginalize itself and get itself blacklisted by too many reasonable people. The Wikimedia project should take note of this lesson. Bdell555 10:20, 28 July 2010 (UTC)
- May I note that this is all about Bdell's proposition to remove the link to the website. Background context is here. Enjoy, because I've had enough. Ericleb01 14:02, 28 July 2010 (UTC)
- No. I was actually the first editor of Afghanistan War Logs to ADD "the link". What I have a problem with is the POLICY rationales that have been proffered for having the link. When another editor reminded everyone that the New York Times said it wasn't linking and that it had "withheld any names of operatives in the field and informants cited in the reports" I considered that a serious factor and urged others to consider it as well. But the whole debate kept coming back to "censorship is bad, m'kay". The fact that we now know that the link is to material that includes "the names, villages, relatives' names and even precise GPS locations of Afghans co-operating with NATO forces" ought to be an occasion to have a deeper think about WP:NOTCENSORED. As if Wikipedia would really be so diminished sans pornographic photos and videos (parental advisory). For what it's worth, more "background context" is also available here.Bdell555 02:09, 29 July 2010 (UTC)
- I don't know anything about the specific case in question, but there are only two possible ways that NPOV and censorship can interact:
- Censor nothing (meaning "nothing", no exceptions)
- Censor everything (meaning "everything", no exceptions, not just things that you find offensive, not just things that $other_person finds offensive, not just what either of you or a hypothetical third person might find offensive, everything).
- As option 2 would result in an encyclopaedia of 0 articles, a media library containing 0 files, a dictionary that contains 0 words, etc., it is not practical. Thus the only way that you can apply NPOV (which is, remember, "non-negotiable") is to not censor anything. In practice, though, all the projects have to bend the NPOV rule slightly - we censor things the laws of the United States and Florida tell us it is illegal not to censor (as this is the POV of the relevant lawmakers, it isn't NPOV). Anything more than that and we might as well abandon the idea of neutral point of view. So, no, w:WP:NOTCENSORED does not need rethinking at all, it just needs to be enforced. As I said way up thread, the only policy that Commons needs to have about "controversial content" is "Commons is not censored". Thryduulf (en.wikt,en.wp,commons) 20:58, 29 July 2010 (UTC)
I'd really recommend taking a look at Ting Chen's posts to foundation-l on this matter, Thry. I think his points are really worth thinking about, and I think they might help you broaden your engagement with this issue - take a look :-) cheers, Privatemusings 01:01, 30 July 2010 (UTC)
- I am currently of the opinion that any censorship at all (beyond legal requirements) is a complete violation of NPOV. I haven't the slightest clue how one could think censorship for a certain group and NPOV are in any way compatible. Yet this discussion about "controversial content" (which, unless I'm mistaken, is all of Wikimedia content) is ongoing, and has been for a while. I've read both of those posts, hoping they would shed some light on the matter, and I still don't have a clue what's going on. My thought process: 1. NPOV includes neutrality. 2. Neutrality includes not favoring one culture, group, or person over any other. 3. Censoring/removing certain content in order to prevent offending certain people while not censoring/removing certain content for any specific other individual or group constitutes favoring the first group over the other. Voila, we arrive at NOTCENSORED. Any censorship is completely disallowed where possible. Can someone please explain to me where I got this wrong? --Yair rand 06:21, 30 July 2010 (UTC)
- I think the point being made is that 3) is synonymous with the practices and processes of every wiki - whether you call it censorship or editorial judgement is a matter of perspective. It's important to be able to see the greys, I think :-) Privatemusings 06:44, 30 July 2010 (UTC)
- I've read those posts and I still don't understand how you can say that censorship and NPOV are at all compatible. A lot of what those posts contain are apparently case studies of examples of editorial judgement (e.g. not including something that is not verifiable is not censorship, nor is a medical journal only accepting medical articles), and the actions of the Chinese in 1968 are not at all relevant. The point he makes about the en.wiki treatment of child pornography is a good one though - the absolute non-tolerance of any discussion about it is not in accordance with NPOV, and at the time that moral panic started I was one of those very strongly arguing that the best way to achieve the aims of not encouraging it was to treat it like any other subject and write solid, factual NPOV articles about it and its effects. And before you say this is a legal aspect, it would be illegal to host and display images of child pornography as defined in the US, but perfectly legal to write an encyclopaedia article about it. Equally, just because it is legal to show an image (e.g. the Mohammed cartoons on ar.wp) doesn't mean that choosing not to show them is censorship. Censorship is preventing other people viewing them and not allowing them to make the choice to see or not to see something. This is why descriptive image tagging is good in theory, but when you get down to the practicalities of it, it is a lot harder to actually implement, as you have to get very, very detailed to avoid any cultural biases (which we all have). Thryduulf (en.wikt,en.wp,commons) 07:01, 30 July 2010 (UTC)
- (edit conflict) I'm afraid I am not able to see the greys. Whether censorship has not been entirely eliminated on Wikimedia yet is irrelevant, I think. If it favors a particular viewpoint then it's non-neutral, otherwise it's fine, isn't it? It seems very black and white to me. How is it possible to have any ambiguity? --Yair rand 07:06, 30 July 2010 (UTC)
- Do you think that if Autofellatio passed featured article review it would ever be promoted on the front page of the English Wikipedia? The point is that Wikimedia projects have always been subject to editorial restraint (which is now tilting towards the view that all information that is not illegal to reproduce in the US should be included somewhere on WMF servers), or censorship if you will. That is what Ting Chen's thoughtful commentary is addressing. 62.58.108.142 04:35, 31 July 2010 (UTC)
- Well, our German counterparts included the Vulva article on the front page, including a picture - also see this Signpost issue (SFW, as long as you don't click the "Teaser" link). Personally I think this is one of the cases where some restraint could be exercised; possibly controversial images are fine, but my personal criterion is that they should only be added in places where a user would expect them. On a second note I would point out that creating an FA article is a major achievement for a lot of editors, which means that banning such articles from the main page altogether could be negative as well. In short - I wouldn't be against adding "Autofellatio" to the main page per se, but it would probably be best to do so without an image. But as PrivateMusings already said - this is really one of the grey areas. Excirial (Contact me,Contribs) 12:59, 31 July 2010 (UTC)
- Censoring a link to the Wikileaks report is absurd. Every time anyone, even Bdell, says "Wikileaks", they've given anyone who's ever heard of something called a "search engine" enough information to find that Afghan War archive. Bdell even provides a link - to w:Wikileaks, that is - but what's the difference between linking to an encyclopedia article that links to the Wikileaks URL that links to the IP address, versus linking to the Wikileaks URL that links to the IP address?
- Of course, I can't rule out that Bdell actually will get prosecuted for that... Australia has famously declared, for example, that small-breasted women in their late twenties are "child pornography", and maintains a system of prior censorship similar to that of China except, of course, for being less technically sophisticated. Obviously all the content from an encyclopedia in a country with free speech won't be allowed in a country without it. But between Australia and China, I think the People's Republic would have a stronger case by Wikipedia policies.
- For example, Wikipedia hosts descriptions of the w:Tiananmen Square protests of 1989 that quote retracted reports, anonymous sources, foreign intelligence agencies and so on to allege that the living people of the 27th Army perpetrated a massacre that the Chinese government says didn't happen. This may well defy the ever-stricter BLP regulations on Wikipedia, reflecting a double standard toward China. Likewise Chinese sources might argue that ongoing coverage of controversial issues in Tibet or Taiwan furthers an American destabilization campaign and could help foment violence, which probably isn't as absurd as the Australian idea that if Wikipedia doesn't link to Wikileaks the Taliban won't ever read the report. But it's widely accepted that even American companies in China are not willing to go along with the Chinese censorship regime - so I don't think we should give a second thought to what the Australian government decrees. They've made themselves the pariah and the laughingstock of the Internet, and that's all there is to it. Wnt 00:06, 4 August 2010 (UTC)
The view of a parent
editA few months ago I found my son looking at what were essentially pornographic photographs on Wikipedia and Wikimedia Commons. This shocked me, as I had considered Wikipedia to be safe, albeit slightly unpredictable for accuracy. I took it upon myself to see that some of these photographs be deleted, and nominated some for deletion on Wikimedia Commons. These photographs were vigorously defended by users who argued they were educational, when I believe not all are. Four photographs of a naked woman spraying herself with whipped cream exist here, differing only slightly in her posing and the amount of cream. Others show several semi-naked women together, differing only slightly in the angle the picture was taken from. More still look to have been lifted from the pages of magazines from the top shelves of petrol stations. If some of these photographs are educational, I believe there is no need to keep four very similar copies. It needs to be looked at whether a photograph of a (fake-breasted) topless model is really going to ever be used educationally.
I argue that some assessment needs to be made about what is kept on these websites and how useful it will be. While these sites may not be censored, it shouldn't mean the content is not carefully checked to ensure it is relevant, educational and beneficial for everyone who uses these websites.
Cleaner1 21:30, 2 August 2010 (UTC)
- Without knowing the exact images you are talking about, I cannot comment specifically on them or their specific usefulness. However, if the images are shot from different angles, different things might be seen or not seen in each, and thus they might be useful for different purposes (even if image 1 and image 2 are never used in the same place, image 1 might be perfect for an encyclopaedia article about X and image 2 perfect for a dictionary entry about Y), or image 1 might have been chosen to illustrate the English Wikipedia article about X and image 2 the Swahili Wikipedia article about the same subject - which one should be deleted as a duplicate of the other, and why?
- The answer to the implicit question in your comment "It needs to be looked at whether a photograph of a (fake-breasted) topless model is really going to ever be used educationally." is yes, it is likely that such a picture will be used educationally - for example it could be used in encyclopaedia articles and/or academic studies about the model in question, about fake breasts (in general, in $culture, comparisons between cultures, comparisons between different types of fake breasts, about the issue of fake breasts in pornography), about the photographer who took the picture, about the publication/website they appeared in/on, about elective surgery more generally, about body image, etc. It could also be used to illustrate a dictionary entry about fake breasts, a textbook for medical students that deals with fake breasts in any of several ways (elective surgery, body image issues, health issues, etc), perhaps it would be useful in literature given to patients before or after they have this type of surgery, maybe it would be useful for students in sex education lessons, or perhaps anatomy lessons. So an image you dismiss out of hand has more potential educational uses than an image that you would probably not think twice about, such as File:Isolatorketting.JPG, which also has >0 educational uses (pylons (general, specific types or locations), insulators (ditto), the specific material used, the shape of the objects, electricity, conductivity, power lines (construction, maintenance, demolition), the manufacturer or designer), all for different audiences. Thryduulf (en.wikt,en.wp,commons) 00:52, 3 August 2010 (UTC)
- Besides the points raised above, I would point out a few more difficulties and issues when talking about pictures.
- An often-heard comment - you voice it as well - is that there should be some editorial judgment as to what pictures should and should not be included on Wikipedia. Of course there is a limit to the content that is allowed, but that limit is mostly a clear set of rules. For example, copyright infringements and fair use are regulated by law, as are images that are illegal. However, it is much harder to quantify offensiveness. Is an image of Muhammad offensive? It sure is perceived as such by a large number of people, though others don't really mind. Same with sexual images; some people are gravely offended by mere nudity, and others don't even raise an eyebrow at strong pornographic content. The difficulty is that removing one set of images means depriving other people of such content.
- A second point I would make is that content - and especially controversial content - is (or should be!) placed only on pages where a user should reasonably be expecting it - in other words, if you don't search for pornographic content it is quite unlikely you will find it. Even on pages with possibly controversial content the controversy is limited to the page's subject area. For example, the page on nudity only shows this - nudity. It shows no sexual intercourse or other subjects that fall outside its scope. The sexual intercourse page shows some paintings and non-suggestive mating among animals. In other words, the severity of the content is limited to the severity of the subject. You will not find explicit pornographic content unless you are looking for it.
- As said before, Wikipedia strives to be an encyclopedia that encompasses all human knowledge, which includes knowledge that most parents would deem inappropriate for children (which is a very limited percentage, though). In this case I would point to en:Wikipedia:Advice for parents, a page that aims to give some aid to parents. For example, there is a separate non-Wikipedia project called School-Wiki whose aim is to create a child-safe encyclopedic environment using content from the English Wikipedia.
- As mentioned before - one won't find explicit content unless one searches for it. I would point out that, even with the most stringent filtering on Google, it is remarkably easy to find such content if one intentionally looks for it (mostly because Google only bars the first results - links on other pages are not filtered). I would equally add that this filtering is completely useless if one navigates directly to a page with questionable content. I would argue that, if someone looks for such content directly, it is generally better that they land on an encyclopedia giving an objective overview, rather than on a site which merely exists for other purposes. I would also point out that the rest of the internet contains seas of questionable content - even if you are not looking for it you can be confronted with such material. In this case Wikipedia has the advantage that it won't show pornographic advertisements or "trick" users into clicking questionable links. But if you wish for your children to be (virtually) free from such material, it is generally best to invest in parental control software, especially if you are not monitoring Internet usage. Excirial (Contact me,Contribs) 19:07, 3 August 2010 (UTC)
I've commented about this issue at w:Wikipedia:Sexual content/FAQ: Wikipedia can't really be a child-safe site, and if the presence of such images raises parents' hackles and makes them more wary, that might actually be a good thing. I should also note that these (I think) are the specific images that have been mentioned in a policy proposal on Commons (See Commons:Commons:Sexual content#Categories). They're now indexed under Commons:Category:Whipped cream fetishism, but when they were first mentioned someone had impishly filed them under "Commons:Category:People eating". (Needless to say, this was not in strict accordance with Commons guidelines on categorization) The photos document the... actress Angelina Ash, whose article was deleted from Wikipedia in 2008 because it had not documented her notability. However, they are linked from the German and Japanese Wikipedias. It wouldn't be appropriate for Commons to delete the pictures, because (among other things) Commons is supposed to be a common image repository from all the Wikipedias: it's not up to Commons to dictate the mores of the English-speaking Wikipedia onto all the other countries. Wnt 23:45, 3 August 2010 (UTC)
any updates? :-)
editany updates would be greatly appreciated :-) - especially if they might help me / us understand what the tangible outcomes of this study will be in a few weeks / months time - presumably a written report? (how exciting would it be to draft that report in wiki space?!) - an indication of how conversations with others (librarians, teachers etc.?) are going and perhaps the scale (ten people, 100, two or three?) would also be greatly appreciated - plus of course any indication if you'd like further responses / reflections from editors watching this space :-) - I've been 'pinged' by a couple of people asking me for updates on what's happening here, I guess because they're aware of my interest, but a full and open response would prolly be better.
If it's all getting a bit dry and dull, take 5 minutes to take a look at Category:Whipped_cream_fetishism (not for children, or safe for work) - you can probably even bill for the time ;-) cheers, Privatemusings 00:17, 23 August 2010 (UTC)
Study Update
editRobert Harris here again, the consultant looking at the issues surrounding controversial content on Wikimedia projects. I wanted first of all to thank all of you who have taken the trouble to once again weigh in on a subject I know has been debated many times within the Wikimedia community. It has been very valuable for me, a newcomer to these questions, to witness the debate first-hand for several reasons. The first is to remind me of the thinking behind various positions, rather than simply to be presented with the results of those positions. And the second is as a reminder to myself to remember my self-imposed rule of "do no harm” and to reflect on how easy it is to break that rule, even if unintentionally.
So far, the immediate result for me of the dialogue has been to recognize that the question of whether there is any problem to solve at all is a real question that will need a detailed and serious response, as well as a recognition that the possibility of unintended consequences in these matters is high, so caution and modesty are virtues.
Having said that, I will note that I'm convinced that if there are problems to be solved around questions of controversial content, the solutions can probably best be found at the level of practical application. (and I’ll note that several of you have expressed qualified confidence that a solution on that level may be findable). That's not to say that the intellectual and philosophical debate around these issues is not valuable -- it is essential, in my opinion. It's just to note that not only is the "devil" in the details as a few of you have noted, but that the "angel" may be in the details as well -- that is -- perhaps -- questions insoluble on the theoretical level may find more areas of agreement on a practical level. I'm not sure of that, but I'm presenting it as a working hypothesis at this point.
My intended course of action over the next month or so is the following. I'm planning to actually write the study on a wiki, where my thinking as it develops, plus comments, suggestions, and re-workings will be available for all to see. I was planning to begin that perhaps early in September. (A presentation to the Foundation Board is tentatively scheduled for early October). Between now and then, I would like to continue the kind of feedback I've been getting, all of it so valuable for me. I will post another set of questions about controversy in text articles on this discussion page, because my ambit does not just include images, and text and image, in my opinion, are quite different forms of content. As well, I will start to post research here I've been collecting for information and comment. I have some interesting notes about the experience of public libraries in these matters (who have been struggling with many of these same questions since the time television, not the Internet, was the world’s new communications medium), as well as information on the policies of other big-tent sites (Google Images, Flickr, YouTube, eBay,etc.) on these same issues. I haven't finished collecting all the info I need on the latter, but will say that the policies on these sites are extremely complex (although not always presented as such) and subject within their communities to many of the same controversies that have arisen in ours. We are not them, by any means, but it is interesting to observe how they have struggled with many of the same issues with which we are struggling.
The time is soon coming when I will lose the luxury of mere observation and research, and will have to face the moment where I will enter the arena myself as a participant in these questions. I’m looking forward to that moment, with the understanding that you will be watching what I do with care, concern, and attention. Robertmharris 13:12, 23 August 2010 (UTC)
Questions Surrounding Controversial Content in Text and Related Issues
edit1. It can be argued that most of Wikipedia’s editorial policies and practices (starting with NPOV and Verifiability) are designed both to stave off and then solve conflicts over disputed content within Wikipedia articles, i.e. to deal with a form of “controversial” content. Do any other policies or guidelines need to be established to supplement existing practice? What might these be?
edit- NPOV and V are two policies that are utterly clear in their goal and purpose, while staying as close to neutral middle ground as possible. Additional policies would merely result in policy creep and bias. For example, how neutral would policies on "Sexual images", "Religious images" or "Offensive image category X" be? If we start segregating offensive content into different categories with different rules, we are simply casting a judgment on content ourselves - something we should absolutely avoid. Besides, discussions about offensive content already pivot around existing rules, trying to bend them to fit the argument, which will only be amplified with even more rules. And as I stated before - neutrality and verifiability can be measured fairly objectively; offensiveness cannot be measured. Excirial (Contact me,Contribs) 19:18, 23 August 2010 (UTC)
- The WP:NOT#CENSORED policy is also crucial, and rather insufficient for its purpose. The past three years have featured a phenomenon of "deletionism" on Wikipedia — a term with various meanings, including people who propose articles for deletion based on lack of notability, those who propose articles for deletion because they don't have anything nice to say about a living person, those who rapidly and repeatedly revert any attempt to add certain information to an article, etc. From my perception, the Wikipedia project suffered a sudden and severe blow with the rise of such elements in 2007, which broke its skyrocketing momentum and sent it on a decline from which I do not know whether it will recover. I think that a stronger spirit of inclusionism, and more respect for the contributions of previous volunteers, is needed on many issues. To me, the pornography debate represents the latest in a series of overwrought fears that have damaged Wikipedia. I recognize that this entire phenomenon has much to do with Wikipedia's success and high public profile, as the culture of librarians abruptly crashes into the culture of the "decency" advocates - a confrontation which happened in schools and public libraries only sporadically, but which is bound to come with full force as both gain easy communication with one another on the internet. But this is a confrontation that we must win.
- Keep in mind that a complex subject rarely has a singular explanation or cause associated with it. Based on your comment I would point out a few things:
- We filter content based upon what is encyclopedic. If we allowed each school band, company, made-up game, word, pet or person to have an article we would likely increase our edits substantially, but I would ask you, does this improve Wikipedia's quality? Edit count alone says nothing - it is the value of each edit that counts.
- The inclusion criterion for articles is verifiability - criticism isn't shunned in BLPs, but there must be stronger-than-usual sourcing for criticism. Equally, they are held to a higher level of neutrality (NPOV). Simply put, if a BLP does nothing but criticize, it is removed.
- There is an upper limit to growth for any system. Eventually we will reach a point where the number of edits equals the time people around the world are willing to invest in the project. Equally, we have to keep people's goals in mind - a lower quality threshold generates a larger quantity, while setting a high quality threshold generates a smaller quantity.
- As we write more and more, there is less and less that people could easily expand on. It is a lot easier and more tempting (my opinion, that is) to improve a stub-class article to a somewhat decent form. Longer and higher-quality articles tend to offer less room for improvement, and thus people may be unable or too intimidated to contribute to them, afraid that they will mess up something that already seems fine. This can equally cause edit rates to drop, though there may be many factors.
- As an interesting side note, I would point to the second image on the right, which seems to show that the average number of created (kept) articles per year is still increasing, even though our edit count is apparently stabilizing. So even with more deletions and fewer edits, we still see a slowly increasing yearly average in article creations. Excirial (Contact me,Contribs) 19:58, 24 August 2010 (UTC)
- Your curve is going up, but its inflection point is in 2007. The rate of increase is decreasing. I don't know whether, even in theory, there can be a top level of growth for Wikipedia — but we're not there yet. We don't have articles for every mineral, every tree, every elected official. With exponential growth we might have reached that stage in a decade, but with this less-than-linear growth it is not certain that we will ever get there. Wnt 20:03, 24 August 2010 (UTC)
- Entirely true of course, and we will equally have to keep in mind that new topics pop up every day - some news items, such as the Olympic Games or international incidents, generate a lot of new pages. However, we should equally keep in mind that the most talked-about articles tend to be created first - when Wikipedia was started we had no pages at all, and masses of new pages could be created to take care of that backlog. Eventually, however, we will handle more and more of that backlog, while the missing articles tend to become more specialist and specific; for example, we may have thousands of editors who are capable of and interested in writing something about the Second World War, yet an article about a notable (but almost unknown) politician is a lot less likely to be picked up by anyone. We need one editor who a) actually knows that politician and b) wishes to write about him. Because of this we keep a gap in our coverage, whose chances of being filled are much lower due to less interest in the subject.
- As for the growth rate: it isn't that bad, to be honest. Between 2006 and 2007 we went from, say, 900,000 to 1.6 million articles (+700k). 2007 to 2008 was 1.6 million to 2.25 million or so (+650k), and 2008 to 2009 saw 2.25 to 2.7 million (+450k). Sure, it is decreasing, but there isn't an infinite number of subjects to write about, nor should we assume that specialist articles are created at the same rate as mainline articles. Eventually we will flatten out at a creation rate that is substantially lower than today, with a comparatively high rate of genuinely "new" subjects and few that stem from the backlog. In other words, someday we will need specialists to fill the gaps and people who create new articles about new notable subjects. Old articles may still be vigorously updated, but they don't count as creations.
- If anything, I would say we cannot really conclude anything from these numbers, as there are many factors involved. For example, if our number of visitors had increased tenfold between 2007 and 2008, a decrease in edit count would have been more worrisome than if we had also seen a decrease in visitors. Ergo: we cannot infer a causal relationship from a single factor - there are loads of possibilities and parameters, all of which may influence edit and creation rates. Excirial (Contact me,Contribs) 21:52, 24 August 2010 (UTC)
- New articles are also created as topics are split up in the "summary style". One starts with an article on Barack Obama but soon ends up with family, early life, local politics, congress, campaign, presidency, Birther urban legends and so on. I don't think that we should have in our minds that there is a wall around us of what is "encyclopedic" - we're already through that wall. We have articles about video game characters and Playboy centerfolds. You say we can't have an article about every person, but commercial sites like Ancestry.com do it. Is there something about that information that means we should accept that such public records must always and forever only be held by private companies under pseudo-copyright and be available only for a fee? The longer we wait to step on their toes, the more prepared they're going to be to offer the Mother of All Battles when they see the public domain people coming their way. Wnt 17:41, 25 August 2010 (UTC)
- Video game characters and Playboy centerfolds generate plenty of media coverage and are therefore notable. Keep in mind that all we ask is that a reliable external source has published an extended article about the subject of the article. This is an extremely broad criterion which may include many topics that most people wouldn't even know of, and yet it is the core of both the verifiability and notability policies.
- I see you mention Ancestry.com, but I would point out that Wikipedia has a different mission than this site. Ancestry.com's goal is documenting entire genealogical trees, which means that every living being is accepted (and, from that perspective, is valuable information). Another example is Archive.org, which attempts to index copies of every webpage. Could we do the same? Perhaps - but it simply isn't our goal. Wikipedia isn't intended to be Facebook or another social-media site where people can write long pieces of text about themselves. We cherry-pick items that have some historic importance (or at least enough notability that other people write about them) and allow articles on them. But, as interesting as the discussion is, we seem to be drifting away a bit from controversial content into the area of notability. Excirial (Contact me,Contribs) 19:15, 25 August 2010 (UTC)
I want to give a comment related to the images which Excirial presented. English Wikipedia did exactly what Margaret Thatcher did with the UK economy after the 1970s crisis: it has become more conservative and it has underperformed its growth potential. --Millosh 01:02, 26 August 2010 (UTC)
I see many articles at en.wp which were merged into bigger ones, after which those articles vanished without a trace. After that, we've become more conservative with BLPs. OK, we can say that it is reasonable to be conservative in relation to living people, but it significantly contributed to the more conservative atmosphere at Wikipedia. After that, or around that time, de.wp introduced flagged revisions. If it is about the product -- as German Wikimedians say -- it is a good idea. However, at this moment (actually, at that moment), we don't need to care about the product. We need to care about our own community. Removing the possibility for potential editors to see their own mistakes (or their "diversions") go live is very inhibiting. --Millosh 01:02, 26 August 2010 (UTC)
And now we are talking about "controversial content". I expect that Wikimedia will finish up like the kakapo, as Douglas Adams explained in his talk. The only method of reproduction we know is to become more conservative. But, at this moment in time, being more conservative means being closer to extinction. --Millosh 01:02, 26 August 2010 (UTC)
As this process evolves, I am coming closer to the conclusion that the best idea is not to do anything at all. Or, better, I think that the product of this process should be the nicely worded sentence "if you really want your own Conservapedia with all Wikipedia articles, please make your own Family Friendly Wikipedia". While I highly appreciate Robert's involvement, I think that the motives for involving him are ill-fated. I really don't give a shit what Jimmy's super-rich friends think about Wikimedia. I care about Wikimedia. And this is the right path for making Wikimedia projects irrelevant. --Millosh 01:02, 26 August 2010 (UTC)
- NotCensored is, to me, part and parcel of NPOV. A project written from a truly neutral point of view cannot be censored to 'protect' its readers. A "family-friendly" project, meanwhile, can never be truly written from a culturally neutral point of view. The two approaches are absolutely irreconcilable. We chose NPOV here.
- The beginning of any content dispute is the recognition that we are free to use controversial content if that is what makes for the best possible article, and we are free to NOT use controversial content if it would make for a worse article. And then, starting from that foundation, the wiki-process begins. It works because it gives as much latitude as possible to editors who are actively on the scene trying to make the best article possible. There is no valid reason to abandon NPOV and the wiki-process now. --Alecmconroy 12:20, 27 August 2010 (UTC)
Inclusion decisions should be made solely by preferring actions most likely to advance the Foundation's Mission, including external factors influencing the success of that mission. If showing images of religious figures prevents a superstitious person from learning the truth about biology, then the inclusion of such images should be questioned. We would do the same with textual statements which called known physical, chemical, or biological truths into question. 71.198.176.22 21:52, 29 August 2010 (UTC)
2. Is it fair and/or legitimate to compare policies and practice around the selection of images for Wikipedia articles with policies and practice on Commons for image selection in that project?
- It is always fair to see what the different rules are, but I think you mean harmonize them in some fashion, and this is not legitimate. DGG 19:00, 23 August 2010 (UTC)
- Many articles, many different Wikipedias, many rules. If anything, I see Commons as a top-layer project which provides images for all the different Wikipedias. If we censor or restrict the top-level project, this will essentially enforce that policy for each and every Wikipedia. If anything, I would say we should allow each project to select the content it does not wish to allow (within the limits of the five pillars and legal requirements, of course). That way certain communities can restrict content while not hindering other communities from allowing it. Excirial (Contact me,Contribs) 19:18, 23 August 2010 (UTC)
- Wikipedia has the limited scope of being an encyclopedia. Commons has the limited scope of being an educational resource. Though these are related, they are not the same: Commons can and should offer a deeper pool of images than are needed to illustrate even all the Wikipedias in the world. The proper scope of Commons may still be somewhat debatable, but the place to discuss this is COM:SCOPE. Wnt 19:29, 24 August 2010 (UTC)
- Commons' mission is far broader than English Wikipedia's. Commons is a repository, a photo archive. Assuming funds and resources were available, there is absolutely nothing in theory preventing Commons from hosting every single free-licensed image we can get our hands on -- there is no reason it shouldn't, with time, grow into being a Free-Licensed Library of Alexandria. --Alecmconroy 12:27, 27 August 2010 (UTC)
Policies of both should depend entirely on what more likely advances the Foundation's mission, which is the reason for producing a free encyclopedia and collecting media to begin with. 71.198.176.22 21:23, 29 August 2010 (UTC)
3. There has been controversy in the past (and as recently as the dispute this summer over the Acehnese Wikipedia templates) over the question of how far local project autonomy should extend its ambit over basic Wikimedia principles. Is it possible to state a policy in this area definitively, or should each individual case be evaluated on its own merits? And if there were to be a policy (or set of guidelines), how might it (they) be stated?
- It is not possible to have a definite rule for how to interpret the common principles; any such rule would inevitably need its own interpretation. DGG 18:58, 23 August 2010 (UTC)
- Wikipedia's base rules are mostly a definition of what it means to be an encyclopedia, and the top-level rules are exceptionally few. Projects have a lot of latitude in forming their own rules and autonomy, but they should have no autonomy in aspects that define the basics of an encyclopedia. For those unaware of what the "Acehnese template" issue is, here is a link. Excirial (Contact me,Contribs) 19:18, 23 August 2010 (UTC)
- Opposition to censorship is one of Wikimedia's founding principles. It follows that any imposition of censorship by WMF on its member projects that is not done under true legal duress would be a very severe deviation that would call into question whether WMF has founding principles, or what they really are. That said, the WMF has only a limited ability to uphold free communication on a site if a majority of volunteers oppose it, and generally, deleting the Wikipedia in a given language would be a far more serious destruction of content than acknowledging a temporary defeat on a limited issue. While such circumstances may apply to Muhammad drawings on a Wikipedia created largely by Muslims, there is no such issue regarding sexually explicit images on English Wikipedia, which certainly face no strong consensus of opposition.
- Though I am completely unfamiliar with this issue, I don't think that forcing the Acehnese Wikipedia to remove their protest could possibly be the right answer. I would rather have seen WMF require, at most, a link to issue a response or (better) to invite people to discussion. As Westerners we find fault with Islamic culture for resorting to violence rather than reasoned discussion, yet when such discussion was offered, WMF ran away from it. That can't be right. So far as I know the Islamic prohibition on Muhammad images is a matter of idolatry, but we heathens have absolutely no idea of idolatry in our heads when we look at them. We should have had a chance to talk things like that over, and try to start a civilized way of showcasing this difference of opinion. Wnt 19:52, 24 August 2010 (UTC)
- If any large Wikipedia had set up a protest against the Acehnese Wikipedia, the sheer mass of editors could have forced whatever content on that Wikipedia they wanted. I fail to see why the reverse would be okay.--Prosfilaes 23:15, 29 August 2010 (UTC)
Although operated (largely) independently, the individual wikis are each part of the Foundation's mission and under its umbrella. So there will be basic principles that all WMF-hosted reference content must comply with (whatever those principles might end up being). Numerous examples already exist - the privacy policy, NPOV, and others listed at Founding principles. To the extent that each wiki community is part of a larger community of all Wikimedia users, there are agreed principles that individual wikis cannot actively undermine. For example, the Palestinian Wiktionary cannot have a banner advocating vandalism to the Hebrew Wiktionary, nor can the Flemish Wikisource renounce the WMF privacy policy and declare that user IPs should be public.
I would suggest that "undermining the goals of the project" is a basic no-no. While WMF is not a corporation, imagine one part of any organization formally taking steps to undermine (rather than seek change of) the fundamentals of the whole. Not okay. A banner by one wiki advocating action against other projects is within that category: it undermines the goals of the project and misinforms new editors and visitors as to the project's ideals and approaches. The question that might exist is whether the images concerned are of educational value and free of prohibitive copyright issues. One might legitimately do a number of things, but a blanket conflict with mission, goal and basic principles, and undermining the project as a whole, are not viable options. As to what such a list of "basic principles" might look like, I would suggest starting with those things without which the mission of WMF's wiki community cannot be fully undertaken, then those things where choice exists but we have chosen one way out of multiple alternatives as a community-wide norm (we could allow edit warring but are unlikely ever to do so), and for the rest ask the community itself what it considers "basic principles" - i.e. what matters can no wiki abjure without prior WMF or wider community agreement, and what items can a community abjure but should not, and hence a very high barrier should exist to doing so. FT2 (Talk | email) 09:12, 25 August 2010 (UTC)
If someone wants to have a wiki according to their own POV, then he, she or they could use, for example, Wikia. If some culture is not able to accept one scientific source of information and its principles, they should consider other sources of information and other places on the Internet. --Millosh 01:18, 26 August 2010 (UTC)
- Local project autonomy is essential to making Wikimedia a viable wiki host. More than anything else, each wiki has to have a 'community', and the true authors and workers here are 'the community'. And a community HAS to be able to decide for itself how to run the place.
- At En.Wiki, we decided to do the ultimate in modern free speech: we included images of Muhammad. But that's En.Wiki's decision, and even though I'm a huge supporter of that decision, I have absolutely zero desire to force our Acehnese community to do likewise against their consensus. What's right for En.Wiki may not be right for the Acehnese Wiki. I think removing the Muhammad images would be censorship and a non-neutral point of view -- but if the Acehnese community truly wants to go that way, by all means, let them!!
- In short, as strongly as I feel about anti-censorship, I feel even more strongly about local-community-governance. Jimbo's "Porn Putsch" was so devastating because it tried to abolish BOTH principles at the same time. --Alecmconroy 12:38, 27 August 2010 (UTC)
- But en.WP doesn't do the ultimate in free speech; it's usually more conservative on pictures of sexuality than many other Wikipedias, including the Farsi WP.--Prosfilaes 23:15, 29 August 2010 (UTC)
Each would need to be decided on a case-by-case basis by IP address, which is as close as the Foundation can reasonably get to geography and language preferences. If you instituted such rules, you might run afoul of the privacy policy which prevents the use of language, geographical, or other identifying information in all but very tightly defined circumstances. And you would need global default rules first. Here are some examples: The Convention on the Rights of the Child shall be obeyed, the Declaration of Human Rights shall be obeyed, the Foundation's Mission shall be supported, the Gini coefficient shall be improved, pollution shall be reduced, recycling technologies shall be increased, health care, internet access, telephony, plumbing, building safety codes, roads, transportation, and other infrastructure shall be improved, freedoms should increase, vital intellectual property will be moved into the commons, etc. 71.198.176.22 21:24, 29 August 2010 (UTC)
4. Individual Wikipedias are organized by language, not necessarily by geographic and/or political territoriality (although often they coincide). However, language itself is a key creator of cultural norms and attitudes, so each language Wikipedia inevitably carries a certain cultural/political orientation. Do we need to put any mechanisms in place to navigate the content differences these differing orientations force upon the projects?
Robertmharris 13:39, 23 August 2010 (UTC)
- could you please explain what you mean by "navigate"? Do you mean harmonize, or do you mean to attempt a chart comparing the different interpretations? DGG 19:02, 23 August 2010 (UTC)
- I think what I meant was more in the "harmonization" realm, but your idea of charting these differences is also valid. I guess I was wondering if it is perceived as a problem that different language Wikipedias might have slightly (or more substantially) different articles about the same subject, and whether there was any way these differences might be brought into a relationship with each other -- "harmonized" I guess is the right word for that. Robertmharris 03:02, 24 August 2010 (UTC)
- I think it would be acceptable to (optionally) place a "translation" link beside any one of the various language links that are offered to the left of an article. Thus the French Wikipedia should, if volunteers desire, make available an English translation of one of their articles in some standard location (maybe a namespace "En:<article name>"). That way one person might do the work of translating the article, then others could perform the task of integrating its content into the English Wikipedia, and debating against whatever cultural biases we have in the process. Meanwhile readers could click the "translation" link next to the "fr:" link at the left, and see for themselves what the latest translation of the French Wikipedia says and whether it looks different from ours. I would not suggest any method of harmonization less passive than this - and I recognize that this would probably only be done for a handful of articles. Wnt 19:59, 24 August 2010 (UTC)
- There are JS gadgets, by the way, that put Google Translate links in the Languages section. See en:User:Manishearth/sidebartranslinks#Wikipedia_interwiki_translator. It is rather useful. TheDJ 21:21, 25 August 2010 (UTC)
No. This is the path to sanctioning POVs. And Wikipedia has its own place in the world precisely because of NPOV, whatever its interpretation is. While we are talking about interpretations, we are more or less safe. If we start to talk about "many NPOVs" as equal ones, we are on the right path to disaster. Every $%^&*#! sect and faction would have the right to demand "their own NPOV". What we need is a stricter scientific approach across all language editions, taking cultural and political issues into account only if we are forced to do so. To be honest, I would lock any Wikimedia project which doesn't follow scientific norms more or less strictly -- until we get enough editors in the specific language who would be able to do so. We don't need tons of cultural, political or religious garbage, we need knowledge. --Millosh 01:13, 26 August 2010 (UTC)
- I don't propose POV forks; yet the truth is that each language Wikipedia gets its own chance to decide how to write an article. Making the content more accessible to other languages doesn't change that. I think that having meta threaten to "lock" whole Wikiprojects over content disputes would be an unparalleled disaster, and even occasional interference in micromanaging decisions would be damaging. I would just link to the other language sites' translation as an independent resource, just like linking to a Web page - not to try to force consistency.
- Google translate links are cute, and much better than nothing, but much worse than a real translation. Wnt 17:27, 26 August 2010 (UTC)
- No to harmonizing! Every wiki having its own unique editing culture is a good thing. There is no guarantee that the way we do things at EnWiki is the 'best'. Let the individual wikis become 'laboratories of education'.
- If we tried to 'harmonize' rules between large communities that have no common language, that would just lead to the English-speaking leadership 'harmonizing' the situation by simply imposing their own standard on a worldwide community (à la what happened in May). And then you have a total mess -- because the English-speaking leadership can't gauge consensus across languages they can't read. At the same time, the local communities can't even participate in the discussion, since they don't speak English. Harmonization bad, diversity good.
- This foundation HAS to be like a nursery, allowing each unique species of plant to grow in its own unique way, and making sure each little seedling has enough sunlight and water. That's the attitude you have to take-- the foundation is here to help projects grow, not to judge the beauty or usefulness of what those projects grow into.
- Just let each project do their own thing, and if it's messy, it's messy. That's the 'wiki' way-- we make a big giant mess while simultaneously trying to improve it. We encourage boldness at the edit level and the article level. We should start becoming open to such diversity and freedom at the new-wiki level also.
- Obviously, there does have to be _some_ limit to local project autonomy-- I would suggest "noncompliance with laws in server nation (US)" as the one really obvious justified reason to override a local community's autonomy. --Alecmconroy 12:49, 27 August 2010 (UTC)
Harmonization would require a chart to begin with. But before you could have a meaningful set of charts, you would first need to have similar charts for the state of the globe and each of its geographic and linguistic subsets you wanted to address. Those subsets might be columns, and the censorship decisions you might want to apply would be rows. Then, people would be able to judge each rule by how well it supported the Foundation's mission and all of the societal, economic, political, and health factors upon which the Foundation's Mission depends in each of the subsets. Depending on the subset selection, the selection decisions for any particular rule would involve tension between two constituencies with differences of opinion about tradeoffs. Therefore, you would need to use complex cost-benefit analyses to make a legitimate decision, and I'm not sure sociology, after decades of abuse for being a "soft" science, is sophisticated enough to support such decisions. Until such time as it can -- which would require a much greater statistical sophistication on the part of social science research paper abstracts, e.g. those of http://equalitytrust.org.uk -- I would recommend against instituting such rules. However, such rules already exist. "NOT#CENSORED" is a farce in the face of policies like w:WP:NOT#HOWTO which censors an entire branch of knowledge: procedural. What makes semantic and episodic knowledge so much better than procedural? If you want to justify a harmonization bowing to political or religious authorities, you might need to address that kind of entirely unsupported, pointless censorship first. 71.198.176.22 21:54, 29 August 2010 (UTC)
- The "how-to" policy is one of the ones used for censorship by misinterpretation (but not very often at all compared to some others). But all it actually says is that articles should be written as encyclopedia entries rather than as a set of instructions to a reader. There is an advantage that I can see to this, which might be counterproductive to explain here. Wnt 06:15, 3 September 2010 (UTC)
Comment by FT2
For what it's worth, some thoughts on non-censorship.
Wikipedia has always been anti-censorship. It's inherent in the idea of "free knowledge". An argument driving non-censorship - which has both proponents and opponents - is that 1/ you can't have "a little censorship" - at some point that "little" becomes too easy to exploit and will be exploited, 2/ the general pattern of human society suggests that both no censorship and censorship can do harm, but censorship seems to do much more, or more pervasive harm.
The recent Wikileaks issue was a good example. In that case there was a major privacy breach capable of doing major harm, an anti-censorship move that could cost the lives of many people directly, and (by deterring co-operation or rallying militants) could cost many lives indirectly. Serious stuff, perhaps one of the largest illicit spillings of private material most people know about. Add in other areas where public knowledge can be a risk - nuclear technology, bank and corporate hacking abilities, kidnappings and killings - all of that can do harm. So indeed harm may arise.
However... those probably pale into insignificance compared to the harm done when people are kept in deliberate ignorance of matters known to others. History is replete with disasters, massacres, genocides, long term health issues, all in the name of censorship, and these routinely affected tens and hundreds of millions.
The scales are more tilted, because the pictures we're discussing are not likely to be a major life threat, and the kinds of material we're discussing are widely accessible anyway. There is room for focus on educational value (site purpose) and for refining norms, but the context underlying non-censorship is worth reflecting on, and if "judgment of an image's value and educational purpose" becomes too close to censorship rather than simply quality selectivity, that principle speaks.
While this may not be completely applicable to "should an online encyclopedia have an image of sexual material", the principle has a direct connection. Censorship has its cost. It's not that long ago in the West that women came to marriage in ignorance - and often died from it. It's still kept (or expected to be kept) that way in many places. I'm not convinced that whipped cream imagery and pictures of some random guy's genitals are life-saving, but some other imagery may well be, and the mere act of showcasing non-censorship may be as well.
Sexual, Mohammed, and other imagery that some would like removed may be education and knowledge, perhaps even life, in some circumstances. Removal for what is essentially a reason of fashion (personal dislike, beliefs and views, or norms of a given culture, rather than any more substantive reason) is far from being a trivial decision.
FT2 (Talk | email) 23:48, 24 August 2010 (UTC)
- See Ting Chen's posts to foundation-l for a cogent explanation of why discussion of 'censorship' in this context is probably the wrong focus. Privatemusings 22:08, 25 August 2010 (UTC)
- I just want to comment that I don't see Wikileaks as having caused harm, even by revealing the informants' identities. The reason I say that is that, according to its own story, the U.S. military allowed a soldier, broadly critical of the war, demoted for fighting, facing early discharge, openly violating the military's anti-gay regulations, to have unrestricted access to hundreds of thousands of documents about countries all over the world on both the SIPR and JWICS networks having nothing to do with his job description without so much as tracking the downloads and asking what he's up to. And the first they know about it is when it is published on Wikileaks. Now when you post the names of your confidential informants to the Internet - and when your "top secret" internet can be tapped into at thousands of places all over the world without your knowledge - well, that's not secret. If Assange hadn't published the names, they would still be knocking around in the databases of Russia, China, France, Israel ... basically, any country that knows how to plug a cable into the back of the computer. And one of them would have told the Taliban about the stuff sooner or later, to get a hostage released or a pipeline left alone or a shipment of opium for some extra funding. And those informants would have just turned up dead, no comment, and it might have been fifty years before someone noticed that a lot of them had talked to Americans. So I see Assange as a life-saver, not a killer. And while this case may be a digression, the moral for Wikimedia is to scrutinize any claim made that our information is harmful. Because such claims are wrong. You may not know how they're wrong, but believe me, they are. Wnt 17:29, 25 August 2010 (UTC)
Comment by Jmh649
While we have a lot of great policies, they are not always applied as thoroughly as they may need to be. People may be blocked for civility issues; however, continuously misrepresenting sources and pushing for censorship by persistence rather than consensus is common. We have examples at Rorschach test, Abortion, and Suicide, among others. Wikipedia is in need of a method of providing editorial oversight and enforcing that opinion if needed. On some issues the two sides will never be able to form a compromise (with respect to the Rorschach test, one side wishes this information removed while the other does not); there is no middle ground. Doc James (talk · contribs · email) 18:46, 25 August 2010 (UTC)
- Consensus can change, so the persistence of debate is not automatically unhealthy. I've just looked at the newest Rorschach RFC, and it does not seem like a serious threat compared to previous attempts. If we need a method of editorial oversight, this should not be some new or existing arbitrary executive power, but a constitution enshrining agreed-upon versions of certain core policies, which all sides would recognize need extra effort to overturn. Wnt 17:43, 26 August 2010 (UTC)
How are you measuring controversy? One measure is the size of the talk page archives. The Rorschach test (censorship with commercial interest behind it) and Abortion (widely acknowledged as very controversial) articles have much longer talk page archives than Suicide, but then again the Torture article has even shorter archives, so I'm interested in better measures than talk page activity. 71.198.176.22 23:13, 29 August 2010 (UTC)
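As an aside on the "archive size" measure mentioned above: here is a minimal sketch of how one might count archived talk subpages programmatically, assuming the standard English Wikipedia api.php endpoint and the Python requests library. The function name and the "subpage title contains 'Archive'" heuristic are illustrative assumptions, not an established metric.

import requests

API = "https://en.wikipedia.org/w/api.php"

def count_talk_archives(article):
    # List subpages of Talk:<article> and count those with "Archive" in the title.
    params = {
        "action": "query",
        "list": "allpages",
        "apnamespace": 1,            # the Talk: namespace
        "apprefix": article + "/",   # subpages of the article's talk page
        "aplimit": "max",
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["allpages"]
    return sum("Archive" in page["title"] for page in pages)

for title in ("Rorschach test", "Abortion", "Suicide", "Torture"):
    print(title, count_talk_archives(title))

Archive counts remain a crude proxy, since archiving conventions and thresholds differ from page to page, which is part of why better measures would be welcome.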
The Librarians' Perspective
For your information, here is an email exchange I had earlier this summer with Nancy Black, Manager, Access and Information, University of Northern British Columbia, on the policies of libraries around accession of and access to materials, especially concerning kids. In the library community, the key concept is something they call Intellectual Freedom (IF), and they've been discussing this concept and its ramifications for many decades (some of the key policies in this area were enunciated in the 1950s). You can see that, by and large, the library community is very concerned to preserve and protect IF, although there have been, and continue to be, many challenges to their efforts in this regard.
To: Robert Harris From: Nancy E. Black Re: questions of access Date: June 9, 2010
Responses to questions from Robert Harris about issues of access, intellectual freedom and censorship.
1. Is there an official Canadian Librarian Association policy or general agreement on the following issues (about library collections themselves)?
Yes, most library associations (Canadian, provincial, USA, the UK, Australia, New Zealand, and so on) have position/policy statements that speak to the principles of intellectual freedom and censorship. I have less knowledge and familiarity with practices in other countries – although there are reports of librarians in more – shall we say – repressive regimes who have risked their lives to hide/protect books that may be considered “dangerous”.
Very broadly speaking, librarians support the principles of Intellectual Freedom (IF) and do not advocate censorship. We believe that these principles are not only foundational to the profession, but are also foundational to the principles of democracy. These position statements inform library policies and practices on the collection of and access to information. Some public libraries will also provide access to these policies and practices through their websites. Selected Association policies can be reviewed at:
CLA (www.cla.ca) Position statement on Intellectual Freedom (other position statements that overlap with this principle can be viewed through the CLA webpages as well) http://www.cla.ca/Content/NavigationMenu/Resources/PositionStatements/Statement_on_Intell.htm
ALA ( http://www.ala.org/) Look at the Issues and Advocacy section http://www.ala.org/ala/issuesadvocacy/index.cfm links that include a number of position statements. Under the heading of Intellectual Freedom, you will see various links to: the Library Bill of Rights, Filtering, Censorship. ALA also has a code of ethics – which also overlaps with IF http://www.ala.org/ala/issuesadvocacy/proethics/index.cfm
International Federation of Library Associations (IFLA) http://www.ifla.org/ also has a statement: http://www.ifla.org/en/faife
2. How do libraries make decisions on acquisitions of material?
A number of factors come into these decisions: purpose of collection, community served, subject matter, gap in collection on that particular topic, perspective of content, age appropriateness, budget, policies, guidelines, practices, type of library (school, public, academic, special). Typically librarians rely on reviewing journals to make selections (Choice, Quill and Quire, School Library Journal, Times Literary Supplement, VOYA, Books in Canada, Publishers Weekly, Booklist, to give a few examples); we also rely on our own knowledge of a particular subject area/discipline, familiarity/knowledge with a genre of literature, and knowledge and understanding of the groups we serve and their needs/interests. Typically most libraries have collection development policies that also guide selection and collection practices. Such policies usually make reference to the principles of the IF position statements and, especially for public and school libraries, it is important that such policies be endorsed/supported/approved by their respective boards. Research in our discipline has shown that those libraries without a board-endorsed policy (especially true for school libraries) are more likely to lose cases where/when there have been challenges to books and are more likely to be forced to remove challenged books from the collection.
3. Do questions of cultural sensitivity and sensitivity about depictions of sexuality play a part?
Yes and no. I suppose that some librarians grapple with this more than other librarians (especially school librarians) and some librarians are less comfortable with selecting such materials for their libraries. Research has shown that librarians are susceptible to censoring in their selection practices – in other words, some librarians will choose not to purchase items that they know or suspect may be challenged by their communities. However, broadly and generally speaking, in principle, librarians should select materials even though the information may be unconventional and even though some members of the community may find the information offensive. It is important for librarians to select materials that represent all points of view, and to recognize that libraries are the cultural storehouse of a society's memory, that the information collected is reflective of a society/community, and that libraries, as such, should be neutral and non-judgemental in their selection practices and in the development of a collection. The understanding is that librarians do not monitor what others read, and that it is up to parents, not the librarians, to monitor what their children read. (There was quite an interesting case about 10 years ago where parents of the Surrey School System challenged picture books dealing with homosexuality – the title of one of the books was, I believe, "King and King".)
4. Are there various levels of display and accessibility of materials in most libraries?
I suppose so, in the sense that in public libraries and school libraries materials are arranged according to age groups: hence the picture books in the children's areas, the teenage books (Young Adult – YA) in the teen (or YA) areas, and so on. But this is (or should be) more about reading level than sensitivities.
5. And are these decisions about accessibility made on the basis of potential cultural sensitivities or sensitivities around sexuality (I'm talking in general here, not a policy about kids). In other words, are there certain books that are restricted in some way because of their subject matter?
Ideally, no – this should not be happening; but there are always some librarians out there who are much less comfortable and knowledgeable with the principles of IF. It was quite interesting when the Madonna book (Sex) came out a number of years ago – various libraries had various responses to this book. Content aside: the packaging, format, metal covers, extremely poor paper quality, and lack of cataloguing information (most books provide cataloguing information as a guideline for cataloguers and to ensure consistency of cataloguing practices) represented a number of challenges for libraries as to what to do with the book. Knowing that it would be extremely popular, the metal covers and poor paper quality meant that this book would simply not stand up and would be ruined in very short order – this posed a preservation dilemma and, believe it or not, there were a number of articles about the "problem" of shelving, preserving, and ways in which to provide access (or not) to the book. There was one tongue-in-cheek article written about uses for the metal covers after the pages of the book had been destroyed, such as using the covers as food trays, or like a toboggan. A number of libraries felt compelled to restrict access – simply to preserve the book, never mind the content. Some libraries chose not to purchase it (mostly because of the hype); some purchased several copies and then, when they were ruined, simply did not replace them; some purchased it but restricted access and borrowing – that is, placed the book in the Reference section – which is traditionally non-circulating. I know of one library which kept the book in a glass case and, every day, turned one page of the book – this served to allow everyone and anyone to "read" the book, but at the same time preserved it. Eventually, when the hype was over, they added the book to the collection.
There are some librarians who believe it is their ethical duty to "protect" some readers from some material and will take steps to either not purchase an item, or shelve it in a place that makes the item less accessible. It has also been discussed in our literature that certain cataloguing practices (especially with fiction) make items less easy to find in the catalogue by subject searches – which of course affects accessibility (see Rothbauer in the reference list below).
6. Do libraries have a policy about demarcating volumes within the library itself -- in other words, some sort of system that identifies certain volumes as potentially objectionable, or safe?
No – this practice is definitely frowned on and should not be happening. This is counter to our principles, philosophies and practices. The idea of “rating” is abhorrent to many librarians – and really who is to decide? Rating and filtering only create a sense of false security. We work from the understanding and position that people should be able to make their own decisions; parents should monitor children’s reading. If this practice does take place in a library, it would be because the staff are misinformed, misguided and sadly unaware of IF principles; or in the case of school libraries, perhaps forced to do so by administrators who have very little understanding of IF principles. On the point of “safe”, see the Bernier article from the reference list below.
7. Are computers in libraries restricted in any way (ie filtered)? Is their use monitored (officially or unofficially)?
Some libraries (sadly) have felt the need to filter computers used in children's areas of libraries. Of course the irony is that filtering is unreliable and typically only serves to keep out "legitimate" information (information about breast cancer, for example, might be filtered because of the word "breast") and the "questionable" stuff still gets through – because the creators of that stuff have figured out how to beat the filters.
I suspect that, unfortunately, some unofficial monitoring does go on. I am not able to provide any statistics on this. Public and school libraries in the States were forced into this very awkward situation a few years ago by the Children's Internet Protection Act (CIPA), which required these libraries to use filters in order to receive funding – some libraries refused to do so (see the Wyatt article in the reference list below – a position with which I do not agree, but which, sadly, I suspect represents the position of a number of librarians; also see the Curry & Haycock article).
8. Do individual community libraries have the right to create policies for their own communities, different than those for other communities (because of the cultural makeup of certain neighbourhoods, for example)?
Yes, most libraries will develop policies and practices reflective of the communities served, so, for example, providing books in a particular language if members of that community are predominantly of a particular culture. Having said that, there is still an interest in developing balanced collections reflective of various perspectives, and the IF position statements would still form the underlying principles of practice.
Now, here are a few questions specifically about kids (however they may be defined -- the legal definition is under 18, but that's clearly inadequate)
1. Are kids generally allowed free access to libraries, equivalent to that afforded adults?
Yes, as mentioned, areas of the library are developed for the age groups, but there are typically no restrictions that prevent people from accessing/using materials throughout the library. Many libraries have a "child's" library card as opposed to an "adult" library card, but this has more to do with the amount of overdue fines (lower for a child) and might have something to do with the number of items a child is permitted to borrow, and less to do with access to the adult areas of the collection – although historically/traditionally, the "child's" card was a method to limit access. I should also note that libraries also maintain confidentiality and privacy of patron records; libraries only keep a record of items currently checked out to a patron – if someone asks for patron information, this would not (or should not) be provided (hence a great deal of consternation for American librarians with the advent of the PATRIOT Act – which hearkens back to the days of the cold war and McCarthy era when FBI agents tried to obtain patron information from librarians). When the items are returned, this record is then deleted. Whenever a patron (regardless of age) puts in a request for a book and the library calls the home (or sends some kind of message) of the patron to let the person know the item is in – they will not ever give the name of the item – they only state that there is an item on hold for the individual.
2. If yes, are policies on acquisition of materials adjusted because of the fact that those materials might be accessible to kids?
No, they shouldn’t be.
3. Is the distribution of certain volumes restricted (in stacks where the volumes have to be specifically requested, for example) because of sensitivity to potential viewing by children?
No, they shouldn’t be.
4. Is the use of computers by kids restricted or monitored in any way (e.g., if a library staff member sees a child surfing porn on the Net on a library computer, do they intervene? Do they intervene if they see an adult doing the same?)
No, they shouldn’t be; but again, some library staff do not understand the principles. There was an interesting case a few years ago when some library staff at the Ottawa Public Library brought forward a harassment suit to the administrators because some staff were upset and angry at being subjected to what they believed was objectionable information being viewed on the internet by some patrons. I don’t recall all the details now, but one of the remedies was to change the position of the computers and to add privacy screens. In my opinion this also speaks to a training issue and I believe that all library staff should understand IF principles and should know how to respond appropriately to various situations.
5. Are there resources specifically designed for children, and are children "forced" to use these resources as opposed to any other in the library?
Yes, there are resources on all topics that are geared for children; but children are not “forced” to use those as opposed to other resources that might be in other areas of the library. I suppose in school libraries, they are “forced” to use those resources, mostly because materials for adult library patrons would not be collected as part of the school library collection.
6. Finally, how do librarians generally view Wikipedia as an appropriate site for either adults or children? Is its use encouraged, discouraged, or neither? What might Wikipedia do to improve its reputation among librarians as a suitable reference point for both adults and children?
This is an interesting question, and I'm not really equipped to answer it. I have been working in an academic library for the last 14 years, so I'm not sure how public or school librarians view Wikipedia. I suspect there are wide-ranging views/perspectives on this. Broadly speaking, I know that many librarians do rely on Wikipedia to help answer reference inquiries and probably direct people to that source. I know that my own children were directed to use Wikipedia as a reference source, but not as an only source of information. In the academic environment, I use this source myself, and I do direct students to the source, but I do suggest/advise that Wikipedia (depending on the question or information need) is a good starting point and that people should look at the information and the sources provided, and if additional information is needed, compare what they find with the information in Wikipedia to analyze for gaps or inconsistencies in the information – I advise that Wikipedia is a tool. In terms of being concerned with children accessing Wikipedia for what might be considered "questionable" or unconventional or "offensive" information, I would still take the position of IF principles and would hope that my colleagues would do the same.
As for the other part of your question: how to improve the reputation of Wikipedia among librarians – I don’t know an answer to that and can’t really answer on behalf of the profession. Speaking personally, I think that continuing with the practices currently in place around editing and so on, will in the long run build on the reputation. If I have other thoughts on this, I’ll let you know.
Just picking up on your comment about "how to deal with community complaints about the presence of potentially 'objectionable' volumes within a library": most libraries will have a collection policy in place, and most libraries will have a process or procedure for how to deal with a challenged item. Research has shown that libraries that have such policies, procedures and practices in place are more successful at overcoming such challenges.
I’ve included a few references below that you might find interesting. The topic of IF and censorship is a research area/interest of mine and I’ve followed the discussion for several years. My IF “guru” is Dr. Ann Curry.
References:
Bernier, A. (2003). The case against libraries as "safe places". VOYA, 26 (3), p. 198-199.
Boon, M.H., Howard, V. (2004). Recent lesbian/gay/bisexual/transgender fiction for teens: Are Canadian public libraries providing adequate collections? Collection Building, 23, p. 133-138.
Curry, A. (2005). "If I Ask, Will They Answer?" Evaluation of Reference Service to GLBT Youth. Reference & User Services Quarterly, 45 (1), p. 56-65.
Curry, A. (2001). Where is Judy Blume? Controversial fiction for older children and young adults. Journal of Youth Services in Libraries, 14 (3), p. 28-37.
Curry, A. (1999). Walking the tightrope: management of censorship attempts in Canadian libraries. In: Petersen, K., Hutchinson, A. (Eds.), Interpreting Censorship in Canada, pp. 182-198. Toronto: University of Toronto Press.
Curry, A. (1997). The limits of tolerance: censorship and intellectual freedom in public libraries. Lanham, Maryland: Scarecrow Press, Inc.
Curry, A., Haycock, K. (2001). Filtered or unfiltered. School Library Journal, 47 (1), p. 42-47.
Davies, B., Curry, A. (2002). Literature about illicit drugs: libraries and the law. Journal of Information Ethics, 10 (2), p. 13-39.
Jenkinson, D. (1986). Censorship iceberg: results of a survey of challenges in public and school libraries. Canadian Library Journal, 43 (February), p. 7-21.
Rothbauer, P. (2004). The Internet in the reading accounts of lesbian and queer young women: failed searches and unsanctioned reading. Canadian Journal of Information and Library Science, 28 (4), p. 89-110.
Rothbauer, P.M., McKechnie, L.E.F. (1999). Gay and lesbian fiction for young adults: a survey of holdings in Canadian public libraries. Collection Building, 18 (1), p. 32-39.
Schrader, A. (1996). Fear of words: censorship and the public libraries of Canada. Ottawa, Ontario: Canadian Library Association.
Wyatt, A. M. (2006). Do librarians have an ethical duty to monitor patrons' internet usage in the public library? Journal of Information Ethics, 15 (1), p. 70-79. [note: I do not agree with the argument in this particular article, and I consider the argument to be flawed – I include this article because of the perspective] Robertmharris 11:29, 1 September 2010 (UTC)
- http://elementarylibraryroutines.wikispaces.com/Elementary+Librarians+on+Twitter may be useful if you need to bounce ideas around. Exposure to superstition is more likely to harm readers than exposure to expressions of sexuality, although both are likely to distract, delay, and sometimes anger some of them. Keep calm and carry on. 71.198.176.22 02:53, 4 September 2010 (UTC)