Bot aid

A lot of small projects (and even WikiProjects on larger Wikimedia projects) don't have people who are able to make bots. This page is intended as a place where people who don't know how to work with bots can ask for help.

This page is for internal Wikimedian use and for helping other non-profit projects. For hiring bot programmers and offering paid bot work, see WikiHR.

People who are willing to help

  1. --Millosh 23:41, 8 April 2007 (UTC)
  2. FrancisTyers 23:53, 8 April 2007 (UTC)
  3. Betacommand 01:57, 9 April 2007 (UTC)
  4. ST47 (en:)
  5. Menasim 11:40, 11 April 2007 (UTC)
  6. Alai 04:17, 13 April 2007 (UTC)
  7. Andre Engels 02:16, 16 April 2007 (UTC)


Please describe your needs below.

  • Hmm, I was having a think, but is there a bot that could link articles on Wikipedia to related articles on other sister projects? (i.e. an article on 'pedia about Dave Jones is linked via the wikipediapar template to a Wikinews article about him). Not sure if one already exists, but would it be possible to create one? --Markie 21:45, 12 April 2007 (UTC)
    • That would be easy to create; the bot would just have to be programmed to do a search for a specific article on the different wikis. Cbrown1023 talk 22:31, 12 April 2007 (UTC)
    • It is possible, but it would need to be supervised. - FrancisTyers 08:20, 13 April 2007 (UTC)
      • The question is: does Markie know how to program a bot? If he doesn't, some of us should help him. --Millosh 09:02, 13 April 2007 (UTC)
        • No, sorry, I've got no idea how to code a bot. If you think it's worth it and would be a good idea, then go ahead. I was just throwing an idea that I thought would be good into the ring. --Markie 09:20, 14 April 2007 (UTC)
  • Could you help with creating a local version of the CommonsDelinker that would delink pictures on a single project upon deletion? I have found the source of the CommonsDelinker, but it seems a bit too complicated for me to modify for my purpose. --Dami 12:22, 26 May 2007 (UTC)
    • I'm actually working on similar code; I just have a problem retrieving the list of images from the deletion log. Betacommand 18:56, 30 May 2007 (UTC)
  • I would like to have a bot that replaces, in my wiki, every keyword for which a wiki page exists with a link to that page. Is this possible with pywikipediabot or other bots? --Christian 12:58, 18 June 2007 (UTC)
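The last request above is close to what pywikipediabot's replace.py does with a set of replacement rules. As a hedged illustration of the core text transformation only (the function name and inputs are made up for this sketch; a real bot would fetch the list of existing page titles from the wiki and would need extra care around templates, headings, and image syntax):

```python
import re

def link_keywords(text, existing_titles):
    """Wrap the first plain occurrence of each keyword that has its own
    wiki page in [[...]]; keywords that are already linked are skipped."""
    for title in existing_titles:
        # Already linked somewhere on the page? Then leave it alone.
        if re.search(r"\[\[\s*%s\s*[|\]]" % re.escape(title), text,
                     re.IGNORECASE):
            continue
        # Link only the first whole-word match that is not inside brackets.
        pattern = re.compile(r"(?<!\[)\b%s\b(?!\])" % re.escape(title))
        text = pattern.sub("[[%s]]" % title, text, count=1)
    return text
```

Linking only the first occurrence per page mirrors the usual wiki convention; dropping `count=1` would link every occurrence.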

Currently functional bots

Bot Creation

I'm a regular contributor to a guitarists' sheet-music wiki called TabWiki (MediaWiki version 1.4.2). I think some simple bots that check for proper categorization and template usage would be very useful for the site, but I couldn't find any documentation or source code online explaining how to program one. I understand that such bots tend to use the screen-scraping technique, so I think I can retrieve the information I need from the site (e.g. does this page have such-and-such category added and the so-and-so template at the top?). Where I'm really stuck is how to post that information back to the wiki. I suspect I have to use a series of "action=submit&wpSave&wpTextbox1=TEXT&wpStarttime=20070424..." parameters, but how exactly do I do it? Is there an online manual I can read? Sample source code? How can I post large amounts of text to "wpTextbox1" and make sure there is no problem putting arbitrary text as part of the URL? Is there a better (but still easy) way to do it?

I am somewhat of a novice C# programmer, but I feel comfortable with the language and I could learn whatever I need to get the job done. Any advice would be greatly appreciated!

-RockyRaccoon 05:34, 24 April 2007 (UTC)
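On the question above: the checks can run against the raw wikitext (which is simpler to parse than rendered HTML), and the save is an HTTP POST to index.php?title=PAGE&action=submit that carries wpTextbox1, wpSummary, wpSave, and the hidden fields scraped from the edit form (wpEdittime, wpStarttime, and on versions that have them, wpEditToken) in the request body, so long or arbitrary text never has to go in the URL. As a sketch of the checking half only, here are two hypothetical helpers; they are written in Python for brevity, but the regexes translate directly to C#:

```python
import re

def has_category(wikitext, category):
    """True if the wikitext contains [[Category:<name>]],
    with or without a sort key after a pipe."""
    pattern = r"\[\[\s*Category\s*:\s*%s\s*(\|[^\]]*)?\]\]" % re.escape(category)
    return re.search(pattern, wikitext, re.IGNORECASE) is not None

def has_template(wikitext, template):
    """True if the wikitext transcludes {{<name>}},
    with or without parameters."""
    pattern = r"\{\{\s*%s\s*(\||\}\})" % re.escape(template)
    return re.search(pattern, wikitext, re.IGNORECASE) is not None
```

Note these are deliberately naive: they won't catch categories added through a transcluded template, and category names on a real wiki are matched with the first letter case-insensitive only.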

Bots and the Snoopy Class

I have MediaWiki set up on a PC running Windows XP Professional with Apache HTTP Server 2.2.6, and I am using PHP 5.2.4 for the coding.

I am using the Snoopy class to edit an internal MediaWiki project. I log in successfully as the WikiSysop with the $snoopy->submit() method. The HTTP response code is 302, which is good in this case.

Next I access the WikiSysop page with the $snoopy->fetch() method. I have a plethora of echo statements in my PHP code, so I can verify that I am getting what I expect. The HTTP response code is 200 in this case; I am not sure whether that is good or bad. I add a short string - about 50 basic characters - at the end of the wpTextbox1 text. The size of the page I am accessing is about 20 KB.

Next I use $snoopy->submit() to write the edited page back to the wiki. Again, the HTTP response code is 200, which in this case is bad. When I log back in as the WikiSysop, no changes appear.

So I have this problem with the status/return code of 200. Any help or suggestions on how to correct the code so that it saves the edit would be appreciated.

I thank you and my puppies thank you. Roy F. Dvorak, Westminster, Colorado Roy 16:21, 25 October 2007 (MST)
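A common cause of the symptom above (a guess from the usual MediaWiki behaviour, not confirmed against this Snoopy code): when the submit POST is missing or has stale hidden edit-form fields, MediaWiki does not save; it re-renders the edit form, which comes back as HTTP 200. A successful save is a redirect (302) to the article, just like the successful login. The usual fix is to fetch the page with action=edit first, scrape the hidden fields, and echo them back in the submit alongside wpTextbox1 and wpSave. A sketch of the scraping step, in Python rather than PHP for brevity (the regex carries over to Snoopy/PHP directly):

```python
import re

# Hidden fields MediaWiki's edit form includes; the save handler checks
# them, and a submit that lacks them is answered with the edit form again
# (HTTP 200) instead of a save-and-redirect (HTTP 302).
HIDDEN_FIELDS = ("wpEditToken", "wpEdittime", "wpStarttime")

def extract_hidden_fields(edit_form_html):
    """Scrape hidden <input> values from the action=edit page HTML."""
    fields = {}
    for name in HIDDEN_FIELDS:
        tag = re.search(r'<input[^>]*name=["\']%s["\'][^>]*>' % name,
                        edit_form_html)
        if tag:
            value = re.search(r'value=["\']([^"\']*)["\']', tag.group(0))
            if value:
                fields[name] = value.group(1)
    return fields
```

The scraped fields are then POSTed back together with wpTextbox1 (the full page text), wpSummary, and wpSave; with Snoopy that means adding them to the $submit_vars array passed to $snoopy->submit(). Older MediaWiki versions may not emit all three fields, so the helper simply skips any that are absent.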