H10: Image-based problems
In some cases, users can be the victims of harassment involving images. This can come in many forms, but all have the potential to be upsetting or intimidating. Some examples of how images might be used to harass a user include:
- An attendee of a Wikimedia event being photographed without their consent, with the resulting photographs posted to Wikimedia Commons or a sister project;
- Sending a user images of pornography, violence, racism, or otherwise shocking content;
- Editing a user's photograph, for example to combine it with a shocking or controversial image;
- Using a user's image to illustrate an article that has negative connotations (for example, replacing the lead image in the "pedophilia" article with the user's picture).
How and when to get an image deleted
Getting an image deleted can be a complicated process. The most important question to consider is whether the image actually breaks any rules. Sometimes the problem is not the image itself so much as an attacker using it maliciously. Commons hosts some shocking images, and some that might cause distress to users; that by itself is not normally a reason to have them deleted.
A common form of image-based harassment is the posting of images taken of volunteers without their consent. At events, precautions are usually taken to accommodate those who do not wish to be photographed, typically by means of stickers or lanyards. In some circumstances, these may be ignored – maliciously or otherwise – by photographers or camera operators at events. It can be distressing to have a photograph linked to a username, especially if the link wasn't apparent before. Such a photograph is also treated as a form of [Section link to H8|personally identifying information] on the projects.
Note that images used to harass users are often posted "off-wiki". Learn more about dealing with harassment taking place on external websites.
Images on Wikimedia Commons
Most images found on our projects are hosted on Wikimedia Commons. You can identify if an image is hosted there, rather than on a local Wikimedia project, by the appearance of the "View on Commons" tab along the top of the page. There will also be a note just below the image stating that the file is accessible on Commons.
Wikimedia Commons has a guideline around images of identifiable people which can be useful in situations like this. It states: "The subject's consent is usually needed for publishing a photograph of an identifiable individual taken in a private place, and Commons expects this even if local laws do not require it."
Wikimedia events are usually considered private places. This can be less clear-cut if the event is held in a normally public place such as a library or a university. A well-run event will either provide a form to sign if you are okay with photographs being taken or, more commonly, stickers or lanyards to indicate that you are not comfortable appearing in photographs.
Even in public places, country-specific rules on consent exist. These rules can be complicated, and not all are legally binding. Check whether the country in which the offending photograph was taken is covered by such a rule.
"Selfies" or other images of editors which are uploaded by themselves are fairly common on Wikimedia Commons. Of course, while these are usually fine, users should be aware that images of themselves have the potential to be abused for harassment in the future.
Images on local projects
Images hosted on local projects are dealt with slightly differently. Policies on this tend to vary by project. On most, the unauthorized posting of someone's photograph counts as the release of personally identifying information.
Images involving minors/child pornography
Where there is a suspicion that an image might contain child abuse or child pornography, please immediately report it to the Wikimedia Foundation through legal-reports@wikimedia.org to alert the Trust and Safety team. Include a URL link to the image so that it can be reviewed quickly. Even though situations like this are rare, it is important that the material is reviewed promptly – having all relevant information available at first contact speeds things up substantially. The Trust and Safety team at the Wikimedia Foundation is tasked with reporting and otherwise handling such images.
Even if there's no obvious abuse or suggestion of pornography involved with an image of a minor, it can still be upsetting. There currently is no hard rule regarding the treatment of images of younger editors, uploaded by themselves or by others. Generally it is best to use your common sense – would this image be harmful if it remained? Is this image within the scope of Wikimedia Commons?
- Advise User:K to create an account on the Wikipedia fork and remove it from the offending article himself. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
- Advise User:K to contact the Wikipedia fork to request the removal of their image directly, and recommend they look into legal action against the offending site. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
- Block User:L immediately, as it's obvious they are the one involved with posting this image on external websites. Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future.
- Provide User:K with guidance on security and privacy on external websites to prevent this from happening again in the future, but otherwise take no action.
There is not much to be gained from answer C: while it is possible User:L is involved, there's no real way of knowing whether that's true, and blocking without process in this instance makes little sense. Answer A might be useful if User:K is willing to get involved with such a community, though it seems likely to make the situation worse. Answer D, to do nothing at all, may not actually be as bad as it sounds – there is every chance the "fork" isn't widely read, and intervening may just make things worse further down the line.