Stable pages if no changes in certain period of time

Rather than leaving stability to be defined by users, it seems more reasonable to define the stable version of a page as the last one that wasn't replaced for an arbitrary amount of time. For example, a version that wasn't replaced by another version within N hours (24? 48?) would be considered "stable". (This is the idea behind delayed commits).

The nice part is that it doesn't discriminate between anonymous and logged-in users (as much as we all are just dying to do that). It also simplifies the user interface, and gives an objective definition of "stable" (as opposed to "right", "good", or "official").

The downside here is that vandalism that goes unnoticed eventually becomes "stable", so that "stable" does not necessarily imply "good". Also, for new and oft-edited pages, there may not be a "stable" version. --Evan 08:06, 31 Jan 2004 (UTC)
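Evan's "unreplaced for N hours" rule is easy to state in code. A minimal sketch (purely illustrative; the revision representation and the 24-hour window are assumptions):

```python
from datetime import datetime, timedelta

# Sketch of the rule above: the stable version is the newest revision that
# went unreplaced for a full window (24h here; 48h would work the same way).
STABLE_WINDOW = timedelta(hours=24)

def stable_revision(revisions, now):
    """revisions: oldest-first list of (timestamp, text) pairs."""
    stable = None
    for i, (ts, text) in enumerate(revisions):
        replaced_at = revisions[i + 1][0] if i + 1 < len(revisions) else now
        if replaced_at - ts >= STABLE_WINDOW:
            stable = text  # newest revision so far that survived the window
    return stable
```

As noted above, a new or heavily edited page may have no revision that survived the window, in which case this returns None.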

With this idea in mind, I have created an 'Age Tool' which shows the age of each word in an article, e.g. for the Train article. It does this by changing the background colour based on when the word was first added to the article, ranging from 'in the last ten minutes' to 'over two years ago'. As well as making it easier to spot vandalism, this view also helps to show which parts of the article are 'mature' and which are new and/or unstable. For more details and other examples, see my home page. --JohnW 22:27, 26 Apr 2005 (UTC)
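The Age Tool's core bookkeeping might look something like this sketch, assuming revisions are available as (timestamp, text) pairs; the bucket cutoffs are illustrative, not the tool's actual ranges:

```python
from datetime import datetime, timedelta

# Record when each word first appeared, then map its age to a shading bucket.
def first_added(revisions):
    """revisions: oldest-first list of (timestamp, text) -> {word: timestamp}."""
    seen = {}
    for ts, text in revisions:
        for word in text.split():
            seen.setdefault(word, ts)  # keep the earliest sighting
    return seen

def age_bucket(age):
    """Map a word's age (a timedelta) to a coarse shading bucket."""
    if age < timedelta(minutes=10):
        return "last ten minutes"
    if age < timedelta(days=1):
        return "last day"
    if age < timedelta(days=365):
        return "last year"
    if age < timedelta(days=730):
        return "last two years"
    return "over two years ago"
```

(A real implementation would need proper diffing to track words across edits; treating each word's first appearance anywhere as its birth is a simplification.)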
That's a good concept IMO but it seems to only require an indication of how long the article has been stable. Five minutes = "unvetted, accuracy unknown", five months = "stable, no recent disagreements or changes". Others in between. This would automatically tag vandalism as making a page unreliable. Would be good to add in a link to stable version but simply indicating instability is a good clue to readers and is probably easy and cheap to implement. Next stop, knowing how many different accounts have edited the page and using that as a review guide, to suggest how well reviewed an article has been. With just one author, it's not really a good idea to assume trust but if 50 have contributed and it is stable, it's probably reasonably reliable. Jamesday 21:32, 9 Mar 2004 (UTC)

The disadvantage of an only time-based approach would be up-to-dateness. It wouldn't work for the en:Current events page, for example. Of course, the question is whether a 'stable' version of such a page makes sense at all.

A combination of both might be nice: anon edits and 'new user' edits going into stable after a certain time, or after an experienced full user clicks 'make this stable' or edits it. A short delay for full users is a possibility as well; this could even be a function of some 'experience' value (basically the number of edits that made it into stable).

Interface-wise the only visible change for full users should be the 'make this stable' button (if implemented at all).
The admin interface depends mainly on how configurable these options per-page should be. The simplest setup might be:

  • Show 'stable' or 'unstable' with link to stable?
  • determine 'stable' by (select dropdown):
    user experience [ 50% / 50% ] time
    with options ranging from [ 0% / 100% ] to [ 100% / 0% ].

-- Gwicke 13:59, 31 Jan 2004 (UTC)

How about a simple formula, based on what percentage of a user's edits are reverted by other users? (We might tweak this by discounting reverts from a certain class of user.)
For example, if new user Blatheratian is reverted 30% of the time, he keeps newbie status. But if new user Jim Dandy has no reverts (except from newbies), promote him to preferred status.
My only worry is that some highly motivated user will figure out a way to hack this system, in order to subvert or destroy it. So, at first anyway, we should not make it automatic, but we could use the statistics when considering the granting of sysop rights. --Ed Poor
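A sketch of the revert-rate formula, with newbie reverts discounted as suggested; the per-edit record fields and the 20% threshold are invented for illustration:

```python
# Each edit record says whether it was reverted and whether the reverter
# was a newbie; reverts from newbies are discounted, per the proposal.
def revert_rate(edits):
    counted = [e for e in edits
               if not (e["reverted"] and e["reverter_is_newbie"])]
    if not counted:
        return 0.0
    return sum(1 for e in counted if e["reverted"]) / len(counted)

def status(edits, threshold=0.2):
    """Keep newbie status above the threshold; otherwise promote."""
    return "newbie" if revert_rate(edits) > threshold else "preferred"
```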
Hurgh. Why this endless desire to categorize contributors instead of edits? --Evan 17:12, 3 Feb 2004 (UTC)
I think Ed's proposal is the germ of an idea with merit, but it needs further refinement. Maybe we could look at Kuro5hin's system for ranking articles and see if it could be adapted. Other models worth looking at:'s Web of Trust; eBay's ratings system;'s Meta Moderation system. Here are a couple of URLs to articles that discuss "reputation management" on the Internet:
Sheldon Rampton 04:06, 4 Feb 2004 (UTC)
This proposal might be a nice stopgap measure but is relatively easy to circumvent. Make many pointless submissions which change whitespace or something. These can easily be done 100 times quickly, or by an automated system. See my comments below. Logicnazi 01:08, 20 Sep 2004 (UTC)

I have some major problems with this idea. The first is that no one really cares whether a page is 'stable', in the sense of not changing, and thus it simply doesn't advance the point. People don't care whether the page has had minor edits recently, nor is it really important *when* the page was last changed. In fact, if the page was recently changed because of a major factual error, or because a subtle vandalism had gone unnoticed, the user most certainly doesn't want the stable page. We need to be clear about what we want to achieve before we do anything. I suggest that there are four valid aims of an anti-vandalism project/stable version.

  1. Reduce the amount of vandalism visible to users.
  2. Allow some users to access pages with a greater guarantee of being accurate/vandal free/polished.
  3. Keep the time required by users or sysops to rollback vandalism low.
  4. Pages are kept up to date (if the page users see when looking up information is too out of sync with the page they are allowed to edit, we undo a basic wiki tenet).

The notion of a stable version was an attempt to satisfy 2, but the name unfortunately gives the idea that it is lack of change we are interested in, when it is really some sort of accuracy or guarantee of quality. I challenge someone to figure out why we want to know what pages haven't been changed in a long time, except insofar as we think it is a good predictor of quality. I suggest a three-tiered structure of pages: non-validated, validated, and published.

Any edit of a validated page by an anonymous user or a user of insufficient 'reputation' leaves the validated page unchanged and produces a non-validated alternative. Whenever the wiki has a non-validated version of a validated page (presumably pages still in a rough outline won't have a validated version), a link appears asking online users of sufficient reputation to validate pages (present them with the two versions, differences highlighted, and simple yes/no buttons). When the page is validated, it replaces the previous validated version. Assuming most regular wiki users are helpful enough to click on these links with reasonable frequency, valid anonymous contributions will make it to the valid page in just about the time it takes for someone to skim over the change (this isn't the time to check for subtle factual errors, just to make sure someone isn't vandalizing the page). The problem of merging page changes given simultaneous edits (i.e. another anonymous user trying to edit the validated page before the first anonymous user's changes have been validated) should be no worse than it is now, since the time to validate a page should be small compared to the time to edit that page. A user's reputation is determined by the number of validated edits he has made and the number of rollbacks with malice he has received (i.e. the person making the rollback can check whether or not the username in question was being intentionally disruptive). This of course can be enhanced if needed with systems of meta-moderation and so forth.

Under this system, crediting something as 'published' would work similarly. For pages submitted for publication (only available to logged-in users?), the handling of differences from the prior published version would be similar to the case with validation. However, while one expects most validation reviews to come from users requested to moderate, publication reviews would most likely come from regular contributors in the area. For instance, everyone visiting a validated page which has a version submitted for publication would see a link allowing him to view, comment and *vote* on the publication of this page.

Before everyone (well, everyone who made it this far ;-) ) objects to this "very restrictive"/violating-the-wiki-spirit system, let me say that I'm not convinced this system should be implemented *yet*. It does have some overhead (in time spent doing review) and it isn't clear that the vandalism problem is severe enough to warrant this. However, as any hope of creating a publication version seems to require a similar reputation system, I think it is the right direction to work towards. Almost by definition, no relatively untrusted user should be allowed to change the published version. Therefore some system of voting for approval or review is necessary for a publication version. Since publication requires factual verification as well as simply checking for vandalism, one can't assign publication review to random logged-in users; those users who know something about the project must be allowed to conduct the review (let the experts who browse/watch pages in that area object or help). Since users choose what changes they effect, some system of reputation or status needs to prevent someone from creating a host of accounts to vote their changes through. Therefore something like reputation needs to be established for accounts before letting them vote on publication changes.

This might seem like a lot of trouble when things work pretty well now. However, the advantage of a published version lies precisely in its reliability (if all we wanted was stability we could just work some link to a prior version). Furthermore, as the success of the project grows, the value of subtle factual attacks increases dramatically. If Wikipedia starts to become a widespread resource, politicians and corporations as well as individuals have a higher incentive to maliciously modify the published version to suppress uncomfortable facts. Also, as a project like Wikipedia finishes more and more areas of common knowledge, the ratio of attacks to valid edits will increase. If we want to deal with this, and we must if we want the advantages of a published version, I don't see any simpler options. Once we have a reputation and moderation system, the validation vs. non-validated part is relatively easy to implement and seems to offer few costs and many benefits. Most importantly, this system prevents vandalism from reaching the wiki, or at least the indexed/browsed section, so it will remove any incentive for vandals to continue making modifications, thus (despite appearances) reducing the total amount of maintenance time. Logicnazi 11:09, 28 Aug 2004 (UTC)
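The validation tier described above might reduce, in its simplest form, to something like this sketch; the reputation threshold and the single-pending-draft simplification are assumptions, not part of the proposal:

```python
# Edits by low-reputation users leave the validated version untouched and
# queue a non-validated draft for a quick yes/no review by a trusted user.
MIN_REPUTATION = 10  # illustrative threshold

class Page:
    def __init__(self, text):
        self.validated = text
        self.pending = None  # at most one non-validated draft at a time

    def edit(self, text, reputation):
        if reputation >= MIN_REPUTATION:
            self.validated = text  # trusted edits validate immediately
            self.pending = None
        else:
            self.pending = text    # queue for review

    def review(self, accept):
        """The trusted reviewer's simple yes/no button."""
        if self.pending is not None and accept:
            self.validated = self.pending
        self.pending = None
```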

Verification/ Double-keying

Why not create a "stable" status when a null edit is performed by a different user (different registered username or different IP address) thus validating the previous edit? This double-checks edits without being un-wiki about it.

Of course, this doesn't exactly address what "stable" is supposed to mean, but I tend to think that it means the default page that's shown to someone who hasn't set preferences and isn't a logged-in user; and yeah, it's what would be used to create printed articles for publication. Demi 17:53, 4 Mar 2005 (UTC)
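A sketch of the null-edit validation rule above, assuming an oldest-first edit log: the current text counts as validated once a user other than the last content editor saves a null edit.

```python
def is_validated(edits):
    """edits: oldest-first list of (user, is_null_edit) pairs.
    Per the double-keying idea, a null edit by a *different* user than the
    last content editor validates the current text; any new content edit
    resets the page to unvalidated."""
    last_author = None
    validated = False
    for user, is_null in edits:
        if is_null:
            if last_author is not None and user != last_author:
                validated = True
        else:
            last_author = user
            validated = False
    return validated
```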

Simple policy

As an alternative to the suggestions above, I would suggest a new level of page protection called 'limited protection'. 'Stable' might also be a good term. Pages with this level of protection can be edited by all users who registered more than a month ago.

This way we avoid overhead by counting reversals as suggested above. I estimate that software changes are minimal (cur_restricted can get an extra value), though table user needs a new field date_registration. For most wikis, it should protect the main page against vandalism while allowing most users to edit it. TeunSpaans 11:28, 8 Mar 2004 (UTC)

I like this idea. I have a slightly more complex idea like this, about offering layers of protection; flagging IPs/accounts as "possible vandalbots", "possible vandals", "suspicious", or on the flip-side "10-hr"/"100-hr"/"1000-hr" editors (it's easy to create a hundred accounts and wait a month before using them; hard to spend 10 hrs editing with each of them) -- assuming editors to be casual, well-intentioned editors unless flagged -- and using those flags to determine which pages a user/IP can edit. Sj 06:17, 29 May 2004 (UTC)

Less than 15 characters

My idea is that you should make it so that you cannot edit a page to make the text on the page less than 15 characters long, like I did on my wikiweb program.

A problem with that is that non-sysops who find a page with very questionable content (use your imagination...) would no longer be able to blank it, and it might be somewhat difficult to make sure that every occasional user knows about {{msg:rfd}} (or perhaps subst would be necessary to circumvent your restriction). Please feel free to try to persuade me (whoever you are:) \Mikez 06:59, 6 May 2004 (UTC)
Well, in such a case, the blanker should always leave a note as to why it was blanked! Simple enough.
And who does this help? Sure, nearly every edit of this type is going to be vandalism (though not all... what about creating a new page with a short note about what goes there?), but unless you are really having trouble with accidental page deletion, what's the point? If you are deliberately vandalizing, this does nothing but require a copy-paste. Logicnazi 09:36, 28 Aug 2004 (UTC)
There are a variety of ways a "short" page could be made valid: {{stub}}, etc. I can't see any encyclopedic article being shorter than a paragraph, and 15 characters is a lot less than that. 17:45, 4 Mar 2005 (UTC)
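A sketch of the 15-character rule, amended per the discussion above so that a blanking edit is allowed when the blanker leaves a note explaining why:

```python
MIN_LENGTH = 15  # the threshold proposed above

def allow_edit(new_text, summary):
    """Reject near-blanking edits unless the editor explains why,
    per the suggestion that blankers should always leave a note."""
    if len(new_text.strip()) >= MIN_LENGTH:
        return True
    return len(summary.strip()) > 0
```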

Might a good faith permissive attitude to "vandals" conceivably work?

I'm thinking of the example of en:Summerhill School where the policy of allowing "troubled children" to smash windows, glasses, tease teachers, break rules, etc. was actually quite effective in reforming them!

Yes. It's called "wiki". It works great, when you let it. AssumeGoodFaith, ForgiveAndForget, RadicalInclusiveness and all that jazz. It's the theory that's made Wikimedia great; it's just too bad that misguided people keep wanting to "fix" it. --Evan 02:33, 16 Nov 2004 (UTC)
Assuming good faith works well on human vandals, but is useless against soulless spam robots -- scripts that randomly insert links for porn or black-market drugs into wiki pages.
There's a simple and effective way to prevent bots. Generate an image with distorted letters and require the submitter to enter those letters in a text field. Easy to implement, works great... Websites like Hotmail and Yahoo have been doing this for years. NL 12:41, 5 Apr 2005 (UTC)
They've been doing this for years with no consideration for visually impaired users. -- 01:38, 22 May 2005 (UTC)
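One accessible alternative to distorted-letter images is a plain-text challenge, sketched here; the arithmetic format is just an example, and a determined bot could of course parse it:

```python
import random

# A text challenge readable by screen readers, unlike image CAPTCHAs.
def make_challenge(rng):
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    return f"What is {a} plus {b}?", a + b

def check(answer, expected):
    try:
        return int(answer) == expected
    except ValueError:
        return False
```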

Enhance existing Wiki tools

Why not mark users that are known or suspected of vandalism in Special:Recentchanges and in the page history? Then filter them out if you wish and examine their edits. Use "Vandalism in progress" and "RC patrol" to collect such data. That will greatly simplify spotting vandalism - 11:38, 16 Dec 2004 (UTC)

Or, there should be an anti-vandal bot, which would look for suspicious edits. See en:Wikipedia:Types of bots for this proposal. 16:54, 12 Jan 2005 (UTC)
This will greatly simplify spotting supposed vandalism, not real vandalism, and contributors wrongly accused of vandalism are contributors thrown out of Wikipedia by the Wikipedia vicious circle. I believe that one problem is the accuracy of vandalism spotting. The wiki tool we need is a team of efficiently selected, trained volunteer vandalism hunters. Such a team would be far more efficient than a stupid bot, but could use bots to help them find what could be vandalism. Izwalito 02:16, 9 Mar 2005 (UTC)
Ok, then rate the usefulness of their edits to make things not so black and white, like Slashdot. Then volunteer vandal hunters can filter users with little to no useful edits and examine them more closely. 13:53, 4 May 2005 (UTC)

Ask schools and universities for help

Also, I have noted that a lot of vandalism probably comes from school and university students (sharing the same IP address). If teachers or scholars of such institutions find Wikipedia useful (or use it in class; there was a page about it), they could find and warn individual perpetrators. 09:39, 13 Jan 2005 (UTC)

I have seen too much vandalism coming from high-school IPs to hope for a remedy. Those IPs without good edits should be blocked from editing. It would be the best solution for all. Pavel Vozenilek 20:30, 20 Apr 2005 (UTC)
Maybe the help by the universities comes from another direction. Take a look at this:

research on automatic vandalism detection -- 18:21, 19 December 2007 (UTC)

Borrowing email antispam methods

Some email-based spam-filtering methods, such as SURBL - Spam URI Realtime Blocklists, could be highly effective in controlling certain types of vandalism, especially wiki-spam/spamdalism (insertion of off-topic commercial links).

Creating a wikischool

A wikischool would be a place where experienced users teach their knowledge and savoir-faire to willing students. This would be a good place to get an answer from specialists; such questions and answers could then be compiled and categorized into a FAQ. A wikischool would also be the perfect place to teach and learn foreign languages, and to train would-be admins, bureaucrats, contributors, copyright-violation hunters and vandalism hunters. Izwalito 02:56, 9 Mar 2005 (UTC)

Watchlists

I like the idea of having a "stable" version of each article, but I don't know how it could be realistically implemented.

"I AM SO COOL" vandalism is easily fixed and reverted as long as someone somewhere is watching the article. Malicious fact-changing vandalism is the kind I am really worried about. Changing a year by one digit, etc. The only way to fight this is if the person who added that year in the first place (or someone else who knows the fact; but they aren't guaranteed to exist) is still watching the article.

The only things I can think of are:

  • Encourage good editors to keep articles they know about on their watchlists
    • Make "Add pages you edit to your watchlist" checked by default for new users
      • But people will hate this, so maybe make it exclude minor edits
        • Either because the editor marked it as minor
        • Or because the edit only changed <10 characters
        • Or either of the above
    • Or just make it the default to add to watchlist if you create an article, regardless of your preferences.
  • Make it seem mandatory for anonymous editors to provide references when they change small facts. It isn't really, but scare them into it. Then when someone changes something without a reference they are more suspect.
  • I don't know

A good fix is a social fix, though. A tech fix just encourages people to circumvent it.

- Omegatron 22:05, 24 Mar 2005 (UTC)
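The default-watchlist rule with the minor-edit exclusion from the list above might look like this sketch; using the length difference as "characters changed" is a crude simplification of a real diff:

```python
def should_autowatch(marked_minor, old_text, new_text, created_page):
    """Watch pages you edit by default, excluding minor edits (marked minor,
    or changing fewer than 10 characters), but always watch pages you
    create, regardless of preferences -- per the list above."""
    if created_page:
        return True
    if marked_minor:
        return False
    return abs(len(new_text) - len(old_text)) >= 10
```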

China has a keyword watchlist; couldn't we have something like that on the wiki? Words like F**K and phrases like "I am so cool" can be blocked if they are used alone and not part of a paragraph. It would stop small vandalism without stopping small-time edits. :) 02:03, 29 January 2006 (UTC)

Ban open proxies

Wikipedia should crack down on open proxies which are favourite tools for vandals. For example:

  • ( 03:41, 3 Apr 2005 (UTC)

If proxies are to be banned, then why are IP range blocks discouraged? Wojsyl 11:04, 8 Apr 2005 (UTC)
On Wikinews I suggested that, mainly because of one vandal who is requesting sysop status to stop, and claims he/she is making a bot to vandalise Wikinews; I found the vandal keeps using open proxies. But the reason they gave, which makes sense when you think of it, is that there are places in the world where all people have available are open proxies, and the idea of a wiki is to let everybody edit, except vandals. -- 16:56, 27 May 2005 (UTC)

Warnings on pages being repeatedly vandalised

I don't know if MediaWiki would support this, but is there some way of forcing pages to display a warning to readers if they are being repeatedly vandalised? E.g. "This page is currently being vandalised. This may make this page garbage or it may be very subtle. Try again later."

Obviously, this would need to be a sysop power, but less drastic than banning, and a good warning to innocent passers-by.

Blacklisted word anon edit summary page

Special:Edits by anonymous users with the words "poo", "gay", "rule" or "suck" in them

With a checkbox and an auto-generated summary of the suspect phrase next to each and a blanket rollback button at the bottom for all selected. :-) - Omegatron 20:56, 7 Apr 2005 (UTC)
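A sketch of such a filter; the word list is the one proposed above, and the edit representation is invented for illustration:

```python
import re

# Whole-word match for the suspect terms from the proposed special page.
SUSPECT = re.compile(r"\b(poo|gay|rule|suck)\b", re.IGNORECASE)

def flag_anon_edits(edits):
    """edits: list of (is_anon, added_text) pairs. Return (index, phrase)
    for each anonymous edit whose added text hits the blacklist, to feed
    the review page with its auto-generated summaries."""
    flagged = []
    for i, (is_anon, added) in enumerate(edits):
        m = SUSPECT.search(added)
        if is_anon and m:
            flagged.append((i, m.group(0)))
    return flagged
```

Note that this flags edits for human review rather than blocking them, which sidesteps the false-positive problem with legitimate uses of these words.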

Limit anon edits

Could we limit the number of anonymous edits to one a day per IP? Why would anyone not register if he intends to make more edits anyway? Wojsyl 19:33, 11 Apr 2005 (UTC)

  • Disagree. Some people prefer to stay anonymous for very good reasons. Sometimes people want to add some edits to, say, a poll on the german wiki that they will never visit again, etc. - Omegatron 20:36, 11 Apr 2005 (UTC)
  • I do not understand why some people wish to remain anonymous when editing. Alan Liefting 22:10, 22 Apr 2005 (UTC)
    Because they want to make edits that are not associated with their username. Because they want to make edits on another language pedia without spending the time to register because they won't ever use it otherwise. Etc. - Omegatron 02:19, 23 Apr 2005 (UTC)

Here is another thought: don't change IP 'saves' immediately; instead redirect them to recent changes patrol for approval. Once approved, they can be saved to the page. Also, I think the wiki policy is too strict; all I did was make a joke edit and I got a warning to stop vandalising Wikipedia! It was a very funny joke!!!! (To me at least; administrators have no sense of humour!) I think a few light-hearted jokes on Wikipedia wouldn't, or at least shouldn't, hurt; don't you? Adding a few to a few articles could really loosen up Wikipedia, or put a jokes-related-to-that-article section on each article like the 'see also' section. Heck, there could even be a jokes portal for jokes of all kinds, divided into articles. For the warning templates, a suggestion: instead of subst:test2, subst:test2a or subst:test3, how about subst:test3a and/or subst:test2b: Please stop adding nonsense to Wikipedia. It is considered vandalism. If you enjoy vandalising please try Uncyclopedia. Thank you. I think the aforementioned suggestions would serve to reduce vandalism on Wikipedia quite a bit, without violating the anyone-can-edit policy :) by the way I am an IP/non-member :D 01:55, 29 January 2006 (UTC)

Change to software to hold anon edits until peer reviewed.

All the vandalism I have seen is done by anon edits. I have just reverted some four-day-old vandalism on Wikipedia. Can the MediaWiki software be changed so that anon edits are not saved until peer reviewed by registered users? Logged-in users can then check to see if the anon edits are genuine or are vandalism. At present it requires vigilance at the Recent Changes page to stop vandalism. With 500,000 articles there must be a better system in place to preserve the integrity of the database and become a respected reference. Alan Liefting 22:00, 22 Apr 2005 (UTC)

Cannot be idealistic with 500,000 pages.

With over 500,000 articles at stake, Wikipedia should not be held to ransom by those who vandalise with anon edits. Anon edits should not be allowed. They are a huge source of vandalism, since editing is made to be too easy. It would be at the expense of those who do not have an account and wish to edit articles. This is unfortunate, but I feel that it is a suitable compromise. Alan Liefting 20:09, 27 Apr 2005 (UTC)

I have never edited any wiki which required registration. It's just too much hassle for making some small edits from time to time. And no thanks, I already have about fifty accounts around the Internet; I don't need yet another login and password to remember. 09:05, 20 February 2006 (UTC)

Program for viewing RC

I'm working on a program that makes RC viewing, in my opinion, easier and more efficient. You can check it out at en:User:CryptoDerk/CDVF. I'd like to get some feedback and maybe you'll find it useful enough to use. I don't feel right going around spamming places with links to it, and I'm not overly familiar with meta (I'm an admin on en and commons), so if you feel that there's a proper place to put a link to the page, please link it. CryptoDerk 19:27, 9 May 2005 (UTC)

Wow, cool. I haven't tried it yet, but it looks terribly useful. Don't be afraid to put more links where people will see it. - Omegatron 16:20, 10 May 2005 (UTC)

Limit links

Limit the number of links to like 90% in pages.

Devalue the content

This is probably a bad idea, but for the bot spammers, they mostly exist to boost page rank on google. If google were petitioned to no longer use publicly editable web pages in their ranking system, the bots would be less valuable to their owners, and so would go away. The content could possibly just be valued less for enough of an effect. There are clearly disadvantages to this:

  • Some bots also attempt to increase the random chance of someone going to a website. This would not address them.
  • Wiki content then becomes, well, devalued by google and likely harder to find through google.

Unless that second point can be addressed, I wouldn't consider this a viable option. Still, thought I'd throw it out.

Don't change IP 'saves' immediately

Here is another thought: don't change IP 'saves' immediately; instead redirect them to recent changes patrol for approval. Once approved, they can be saved to the page. Note: my IP is, this is a repeat. Pseudoanonymous 22:39, 7 February 2006 (UTC)

Well, this is something which is very complicated to guess... On one hand there are many good articles, and on the other hand - many bad articles... For bad articles - people go anonymous; for good articles - they register, use the forum etc... But anonymity goes on for some short time... Usually one sticks to his interests... And as for IP-address saving... Well, what can be done better? — The preceding unsigned comment was added by (talk) 05:20, 13 August 2010 (UTC)
This may be the kind of thing you're after. sonia 05:22, 13 August 2010 (UTC)

Go to the vote to prohibit anonymous edits

The vote to prohibit anonymous edits is a non-binding poll to determine the Wikipedia consensus on prohibiting anonymous edits as a method to reduce vandalism. The results will be forwarded to the Wikimedia board of trustees as a recommendation. Go vote! -Kaiwen1 19:20, 17 February 2006 (UTC)

Additive editing only

To reduce some of the vandalism, how about disallowing anonymous editors from deleting more than a few characters at a time? 12:40, 21 September 2006 (UTC)
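A sketch of this additive-only rule using a character diff; the five-character allowance is an assumed value for "a few characters":

```python
import difflib

def deletion_size(old, new):
    """Count characters of old that are deleted or replaced in new."""
    sm = difflib.SequenceMatcher(None, old, new)
    return sum(i2 - i1 for tag, i1, i2, j1, j2 in sm.get_opcodes()
               if tag in ("delete", "replace"))

def allow_anon_edit(old, new, max_deleted=5):
    """Anons may add freely but not delete more than a few characters."""
    return deletion_size(old, new) <= max_deleted
```

This blocks page blanking by anons while leaving purely additive edits alone; a vandal could still delete a paragraph five characters at a time, so it raises the cost rather than eliminating the attack.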

Two ideas for reducing vandalism

Here are a couple ideas I haven't seen anyone mention yet (which doesn't mean nobody's put them out there before... just that I haven't seen them.) Both ideas are "passive" in that they don't require blocking and might fit well into the "Wikipedia philosophy" if you will.

First, in the "warning" messages on vandals' talk pages, it would be nice if there were a way for the vandal to be informed how long their vandalism was actually up on the page itself. I've noticed that those pages that are vandalized often and/or with any regularity also seem to get reverted very quickly. Sometimes it's just a matter of minutes. Vandals should be informed (to the best of our ability to inform them, that is) that their vandalism is short-lived, particularly in those cases when the vandalism is very short-lived. If they want to do something a bit less temporary they should try MySpace....  :)

Second, there should be some sort of effort to inform (again, to the best of our ability to inform) vandals how incredibly easy it is to revert their vandalism. Again, in the majority of cases I've seen it's easier (in terms of fewer mouse clicks and/or keystrokes) to revert vandalism than it is to produce the vandalism itself. I showed a student of mine the other day how simple it was to revert vandalism ["Watch this: click, click, click, click. There. All gone."] and we had a nice chuckle over how much effort some guy had gone to in order to vandalize a page, but he was so easily countered.

The big problem here, of course, is informing people who are apparently not the sort who read a lot, but if they can somehow be informed that their "contributions" are ineffective they'll return to tormenting ants or whatever it is they do in their free time.... -- 16:07, 9 November 2006 (UTC)

Automatic vandalism identification

Perhaps new edits could undergo heuristic filtering, similarly to the way e-mails are filtered? Also edits from untrusted users/addresses could increase the probability of the edit being flagged as potential vandalism.

Vandalism protection

I think that the wiki should have automatic vandalism protection. Perhaps... a word/phrase filter. It should stop any post to a content page that contains the following words or phrases:

fag f@g fags faggot gay thats so gay lesbian lesbo lezbo stupid lezbo greasy mexican loser asshole fucker prostitute suck my penis suck my vigina suck my pussy kiss my ass hoar

These are just some of the examples I wish there were a filter for. I think that adding a filter that would block these words/phrases, as well as other rude words/phrases, would be helpful to Wikipedia. The preceding unsigned comment was added by (talk • contribs) 23:24, 15 April 2007 (UTC).

What about en:Gay, en:Lesbian, en:Prostitution and so on? I don't think a badword list would be a good idea. --.anaconda 23:51, 15 April 2007 (UTC)

KISS (Keep It Simple Stupid)

No edits allowed with an IP address; you have to be a registered user. Also, easier access to admins to do the block when it's seen. It took me 3 weeks as a registered user just to find a list of admins. Perhaps a link on the bar at the left. Leobold1 18:12, 18 February 2008 (UTC)

Preventing vandalism from getting buried under newer edits

One of the biggest problems with vandalism (especially edits that pose as good-faith rather than just being nonsense) occurs when it gets buried under new edits and goes undetected. I propose a solution to reduce that:

Every time an editor is about to submit a change (using the edit form), he/she will have the option (and will be encouraged) to confirm that he/she has checked the previous (most recent/ top) edit, and to the best of his/her understanding that the previous edit is in good-faith. This can just be another check-box beside the "this is a minor edit" check box. If he/she checks the box, then we can at least be sure that if the current editor is of good-faith, then if the previous edit was a vandalism it won't get buried below the current editor's edit. If he/she does not check the box (which he/she is free to do if he/she does not want to take the trouble), then a tag will be placed/associated with the previous un-checked edit (let's call the tag "unverified"). Later on, any other confirmed editor / administrator can easily identify such "unverified" edits, the checking of which were skipped, and remove the "unverified" tags if the edits were not vandalism. (i.e. say v0 -> v1 -> v2 are consecutive versions. Then the editor of v1 -> v2 will have the option of setting verified/unverified tag for v0 -> v1.)

Of course, undoing a previous edit won't require checking the previous edit. Moreover, undoing an edit will invalidate the "verified" status of the last-to-last edit if it was set to verified (i.e. v0 -> v1 -> v2 -> v3 were the consecutive versions, and v2 -> v3 was an undo, then v0 -> v1 will be set as "unverified" even though the editor of v1 -> v2 had set it as verified). This will prevent an earlier vandalism from getting buried if there are 2 or more consecutive vandalisms. The idea is to keep all the vandalisms on the top as much as possible, and make editors of new changes verify that they don't write on top of vandalism without correcting or tagging them.

- Subh83 08:24, 13 April 2011 (UTC)
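Subh83's tagging rules can be sketched as a small history model; the list-based storage and the way undo restores the prior text are simplifying assumptions:

```python
# Each edit may mark the previous revision as verified; an undo re-flags
# the revision before the undone one as unverified, keeping suspect edits
# near the top of the history, per the proposal above.
class History:
    def __init__(self):
        self.revisions = []  # texts, oldest first
        self.verified = []   # verified[i]: was the edit producing rev i vetted?

    def edit(self, text, checked_previous):
        if self.revisions and checked_previous:
            self.verified[-1] = True  # the check-box beside "minor edit"
        self.revisions.append(text)
        self.verified.append(False)

    def undo(self):
        """Revert the latest edit; per the proposal, this also invalidates
        the 'verified' mark on the last-to-last edit."""
        if len(self.revisions) < 2:
            return
        self.revisions.append(self.revisions[-2])
        self.verified.append(False)
        if len(self.verified) >= 3:
            self.verified[-3] = False
```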

Red-highlight the edit button for new users until they complete a tutorial and sign a waiver

It is commonly done to link the edit button, or a similar button, to tutorials. I wish to address the edit button, everyone's nemesis on the site (I am blocked over it), and the related vandalism issues. My proposal concerns the edit button.

Red-highlight the edit button and link it to the tutorials on the site, until the user has completed a tutorial and signed a waiver granting permission to edit; after that the red highlight disappears and the party is no longer a trial user. It is just a concept change: keep the edit button highlighted in red until the waiver is completed and a tutorial used. Simple, isn't it? New users don't understand, and use the edit button right away thinking they are not doing any harm. Because of that, this would begin to solve everything that pertains, and addresses the constant vandalism issues, which I also have right now; I am blocked for that. May I be credited for the change? Everyone wanted to claim it, it seemed; it took three days to find you. That's the proposed programme. Currently blocked jay7bird also at

Have a great day, sincerely J.L.S. supreme citizen justice, state of Utah. Birth name jay lynn swett, my profile or jay7bird at {Jay7bird 00:51, 20 May 2011 (UTC)}
