User talk:Beetstra/Archives 2016


1h 6m fwiw

… that is the current amount of time that COIBot takes to respond to "whereadded" and "report xwiki …" commands. :-/ And other responses are still pending even though that much time has passed.  — billinghurst sDrewth 11:34, 18 January 2016 (UTC)

Still the old problem - the database has grown so big that it starts to seriously lag when counting .. I need to restrict COIBot in a different way. --Dirk Beetstra T C (en: U, T) 11:37, 18 January 2016 (UTC)

Grants:IdeaLab/Bot to detect and tag advocacy editing

Would be interested in your thoughts on this, if you didn't see it. Jytdog (talk) 21:33, 8 March 2016 (UTC)

Thanks for the remark. I'll think about it for a moment - part of this is done by edit filters and COIBot, but this goes further than that. You'd need the input of that antivandalism bot system on en.wikipedia, that scores vandalism. --Dirk Beetstra T C (en: U, T) 03:17, 9 March 2016 (UTC)
Hm! I have no idea how the back-end of all these things works. One of the criteria in my mental filters is "unsourced content" so if that is part of what vandalism-bot looks at, that would be helpful. I imagine some of the semantic filtering too. But i have no idea. ack. Thanks tho for considering this. Jytdog (talk) 03:41, 9 March 2016 (UTC)
The following things are easy:
  • COIBot: check username / IP vs. domains added or text added. If 'CompanyX' adds text mentioning 'CompanyX', or the page 'CompanyX', then there is .. reason for concern (it may be fine).
  • SPA-like accounts. Editors focussing on one company/organisation. COIBot does follow that as well.
  • Typical text is filtered in e.g. - though only for non-draft userspace edits. One could do something similar on mainspace (but we've been reluctant because that may be heavy on the server).
One of the antivandalism bots works on the basis of heuristics: it 'learns' which terms or types of edits are bad (if 'poop' is bad, then likely 'pooop' is also bad), and it improves as it learns. The problem with COI edits is that they should not necessarily be reverted, just tagged (as opposed to vandalism) - but you could teach a bot 'suspicious' content in COI terms in the same way ('we are the best in ..' should be flagged). So I'd consider an edit-filter-like system, not a reverting or tagging bot. --Dirk Beetstra T C (en: U, T) 09:40, 9 March 2016 (UTC)
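The first check in the list above (username vs. domain overlap) can be sketched in a few lines. This is a hypothetical illustration, not COIBot's actual code; the function name `coi_overlap` and the normalization rules are assumptions made for the example:

```python
import re

def normalize(s):
    """Lower-case and strip non-alphanumeric characters for a loose comparison."""
    return re.sub(r"[^a-z0-9]", "", s.lower())

def coi_overlap(username, domain):
    """Return True when the normalized username appears in the domain (or vice
    versa) -- the kind of overlap that warrants a closer look, not a revert."""
    u = normalize(username)
    # Drop the TLD so 'companyx.com' compares as 'companyx'.
    d = normalize(domain.rsplit(".", 1)[0])
    # Ignore very short names to avoid trivial matches.
    if len(u) < 4 or len(d) < 4:
        return False
    return u in d or d in u
```

So `coi_overlap("CompanyX", "companyx.com")` flags the edit for review, exactly the "reason for concern (it may be fine)" case described above.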
super helpful. do you think that project is likely to get picked up, and if so would you be interested in getting involved? I will be mostly useless as I understanding nothing of the technical aspects here. but i am happy to learn and do whatever i can bring... Jytdog (talk) 23:57, 9 March 2016 (UTC)
I am certainly willing to help and give suggestions. I may not have time to actually create bots or edit filters (or whatever) for this. --Dirk Beetstra T C (en: U, T) 06:21, 10 March 2016 (UTC)
Gorgeous, thanks! Jytdog (talk) 13:13, 10 March 2016 (UTC)

COIBot/LiWa bots lag

Hi. Is it possible to restart all those bots? They aren't replying to any commands on IRC. Best regards, —MarcoAurelio 14:24, 13 August 2016 (UTC)

@MarcoAurelio: - COIBot seems to have responded this morning to some commands (I saw an IRC-commanded revertlisting on XLinkBot), is it only the linkwatchers, XLinkBot? --Dirk Beetstra T C (en: U, T) 09:10, 14 August 2016 (UTC)
Hi. Yesterday none of them were answering my commands (whoadded/whereadded). I'm not on IRC now so I can't test. Best regards, —MarcoAurelio 16:39, 14 August 2016 (UTC)

Linkanalysers running? It needs an expert's look

Hi Beetstra. Are the linkanalysers functioning, or is a sufficient number of them running? backlog tells me linkanalyser: 327837 secs. The LinkWatchers are repeatedly bleating Started linkanalysers - None seemed to be alive when it loads a backlog file. When I run show processes I am only seeing plenty of parser, and one each of diffreader and linksaver, nothing for linkanalyser. I get told to go away when I try to start a linkanalyser process. Would you mind having a look? I cannot remember the diagnostics any better, and my notes don't cover this. Thanks.  — billinghurst sDrewth 07:20, 27 August 2016 (UTC)

Oh, I do remember a command (with some editing of output)...

<XLinkBot> I've been active for 8 days, 08:20:48 hours. No records waiting to be reverted. I have received 2110 requests.  Average reversion time is 1167.55 seconds (all 219 reverts); 71.78 seconds (42 reverts < 120 seconds). Fastest reversion 13 seconds, slowest reversion 168384 seconds.
<COIBot> 204 reports in 9 days, 08:19:21 hours (0.01/min)(5 normal, 81 xwiki, 0 ip, 0 meta, 22 poked). 269 edits in parserqueue. Last RC 00 seconds ago. Last LW 3 days, 19:07:46 hours ago. Report when overlap exceeds 25%. Reporting to #wikipedia-spam-t and #cvn-sw-spam (limited) and #wikipedia-en-spam (all); Reportlevels: RC is 0
<UnBlockBot> sDrewth: Uptime: 3 days, 19:29:12 hours. Last RC 01 seconds ago.
<LiWa3_1> LW: 1 day, 15:24:00 hours active; RC: last 0 sec. ago; Reading ~ 812 wikis; Queues: P1=0; P2=3; P3=1 (6362 / -1); A1=1061; A2=85 (0 / 0); M=0; Total: 1916054 edits (810 PM); 92949 IP edits (5.3%; 39 PM); Watched: 1726062 (90%; 730 PM); Links: 66159 edits (3.8%; 27 PM); 233132 total (98 PM; 0.13 per edit; 3.52 per EL add edit); 0 WL (0%; 0 PM); 0 BL (0%; 0 PM); 0 RL (0%; 0 PM); 0 AL (0%; 0 PM)

which I think is telling me that there are over 1000 links in the main ns queue.  — billinghurst sDrewth 07:28, 27 August 2016 (UTC)
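Status lines like the LiWa3_1 one above can be read mechanically rather than by eye. A minimal sketch, assuming only the `NAME=SIZE` format visible in the quoted output (the reading of A1 as the analyser queue is billinghurst's interpretation above, not documented):

```python
import re

# Queue fragment copied from the LiWa3_1 status line quoted above.
STATUS = "Queues: P1=0; P2=3; P3=1 (6362 / -1); A1=1061; A2=85 (0 / 0); M=0"

def parse_queues(status):
    """Extract the NAME=SIZE queue counters (P*, A*, M) from a status line."""
    return {name: int(size)
            for name, size in re.findall(r"([PAM]\d?)=(\d+)", status)}

queues = parse_queues(STATUS)
# queues["A1"] is the >1000-entry queue flagged in the comment above.
```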

Oi, no analysers were starting. Solved the problem, I hope. --Dirk Beetstra T C (en: U, T) 10:30, 27 August 2016 (UTC)

No linkwatcher

Hi. COIBot has been abandoned by its linkwatcher elven mates? It would appreciate it if they could be sent back to make more useful reports. Thanks.  — billinghurst sDrewth 10:33, 29 September 2016 (UTC)

Note, I can see them on irc.wikimedia, just not at freenode.  — billinghurst sDrewth 10:37, 29 September 2016 (UTC)

COIBot creating weird reports

Hello. Your bot should avoid creating reports like User:COIBot/LinkReports/беларус.рф. I've blocked the bot, because I spent a lot of time querying the site for these kinds of pages, and immediately after deletion the bot started recreating them again. Thanks. —MarcoAurelio 10:07, 16 October 2016 (UTC)

@MarcoAurelio: Those bloody 'weird character' urls, where the encoding becomes crazy. I'll have another look where it loses the encoding, but blocking the bot over it is a bit too much, you're now disabling a lot of XWiki spam work because of that. I'd prefer a salting of the pages. --Dirk Beetstra T C (en: U, T) 11:37, 16 October 2016 (UTC)
Also, this is, what, 20 unique pages, 40 saves, over a month's time, where it saves way more in a day if it is running at full speed .. --Dirk Beetstra T C (en: U, T) 11:44, 16 October 2016 (UTC)
@MarcoAurelio: Really? you blocked it for that? Probably a good idea to check prior to blocking the bot. :-/ FWIW I have been around in IRC just need to poke me directly, or find me in a channel.

FWIW it has been doing that behaviour for years. The report is being recreated as the bot is stepping through a backlog and finding other cases of addition. It is not a problem: either just wait out the additions and close it normally, OR get COIBot to ignore them (one of the available options) if they are an issue for you. If you wish to find the url being added, just follow the contributor's luxotools link and you will see what is being added on whichever unicode url it is, usually something in the cyrillic alphabet.  — billinghurst sDrewth 14:22, 16 October 2016 (UTC)

P.S. I unblocked it. I needed it for a number of tasks that far outweighed the report generation issue.  — billinghurst sDrewth 14:23, 16 October 2016 (UTC)
Thanks for having a look at that. And yes, after ~200 deletions of useless weirdly-encoded reports, the fact that the bot continued to recreate them a minute later angered me. I'll try to create a blacklist rule for those. Thanks for your understanding. —MarcoAurelio 10:34, 17 October 2016 (UTC)
Useless, User:MarcoAurelio? For the one you just protected, it was informative to find maceralardünyası.com as the link .. for others, we still can use them to find the original spamlink, and to see who added it or where it was added - some of that stuff should be blacklisted. So no, these reports are not useless. --Dirk Beetstra T C (en: U, T) 10:46, 17 October 2016 (UTC)
Sorry for not being smart enough to guess what the link /беларус.Ñ€Ñ means, and I feel many of us would find them useless as well, since the report provides no information unless, of course, you can see the bot logs to figure it out. Feel free to amend or revert, but my time is too precious to have to guess, report by report, what the bot wanted to report. I'm not a cryptographer. I won't delete further reports if you wish, but if you could please have a look at those encoding issues I'd appreciate it a lot. Thank you. —MarcoAurelio 10:57, 17 October 2016 (UTC)
No, I solely used the report. --Dirk Beetstra T C (en: U, T) 11:06, 17 October 2016 (UTC)
Right. In any case, sorry for the inconvenience or if my wording is unclear or aggressive. Not my intention. Regards, —MarcoAurelio 11:23, 17 October 2016 (UTC)

Link Watchers

Hi, we spoke a couple of months ago about LiWa in fawiki's channel, and apparently you missed the last message I left you. I was on a wikibreak until now and, upon return, I noticed that the problem was still standing. Any thoughts? Thanks. -- Arian Talk 08:25, 25 October 2016 (UTC)

I'll have to have a look at that .. I guess for some commands you are not 'trusted' .. I'll ping you when I have a minute to have a look at it. --Dirk Beetstra T C (en: U, T) 08:30, 25 October 2016 (UTC)

Link analysis seems to have stalled

Everything seems to be present and functioning from the IRC side; however, the link analysis, and thus the writing of link analysis, is not present, see special:contributions/COIBot. Things poked at the analysis through IRC seem to queue and progress, just no output. No warnings are generated. syslog looks okay. Syslogs: case: 151824 secs. coibot: 141 secs. commander: 0 secs. diffreader: 938 secs. linksaver: 85 secs. output: -1 secs. parser: 964 secs. script: -1 secs. special: 13027317 secs. Yours flummoxedly.  — billinghurst sDrewth 22:11, 28 October 2016 (UTC)

@Billinghurst: The bot has a massive backlog .. I guess it does not do too much of analysis to speed up getting rid of it. --Dirk Beetstra T C (en: U, T) 13:49, 29 October 2016 (UTC)
Really? it says that it is stepping through the backlog output files, writing nothing; it undertakes its xwiki requests, and writes nothing to meta. I must be missing something as backlog and status reports have been many times worse.  — billinghurst sDrewth 03:02, 30 October 2016 (UTC)
Not really :-) .. it got logged out. Solved .. I hope. Please let me know if there are more problems. --Dirk Beetstra T C (en: U, T) 03:23, 30 October 2016 (UTC)
Okay, that makes sense. It is writing again. If there is a check that can be run, beyond staring at its absence, please let me know.  — billinghurst sDrewth 10:48, 30 October 2016 (UTC)
I think it tells you as an error message in a channel. Maybe should be added to my bot-monitoring channel. --Dirk Beetstra T C (en: U, T) 10:54, 30 October 2016 (UTC)

Announcing a new mailing list for Meta-Wiki administrators


As a regular administrator on Meta-Wiki, you're allowed to subscribe to the recently created metawiki-admins mailing list. This is a closed mailing list for announcements, requests for help, and discussion between Meta-Wiki administrators. If you wish to subscribe, please fill in the form at this page and then contact Savh or MarcoAurelio via Special:EmailUser using your administrator account so they can verify the authenticity of your request and address. You'll find more information on the mailing list description page. Should you have any doubts or questions, feel free to contact any of us. We hope that this tool is useful for all.

Best regards,
-- MarcoAurelio and Savh 12:30, 9 November 2016 (UTC)
Mailing list administrators for metawiki-admins mailing list.

Message sent to members of Meta:Administrators/Mass-message list. Please see there to subscribe or unsubscribe from further mass messages directed to the whole group of administrators.


Hi! I wonder why the bot does this kind of stuff? Repeating each minute that the link is already blacklisted does not seem like a good plan :) Regards, —MarcoAurelio 15:51, 21 November 2016 (UTC)


[17:02]	mafk	COIBot join rc channel #es.wikibooks
[17:03]	mafk	COIBot join lw channel #es.wikibooks

seems not to work. Can you do some magic? Thanks! —MarcoAurelio 16:04, 21 November 2016 (UTC)

The former is strange, seems like it is looping somewhere. I'll try to have a look.
The latter .. sigh .. it is something that needs a rewrite by copying from LiWa3 .. probably too many channels in one module. Also for this, I'll try to have a look. I hope to have some time mid December for a bit of better bot maintenance work. --Dirk Beetstra T C (en: U, T) 03:18, 22 November 2016 (UTC)

Please restart COIBot

Hi. I think COIBot needs a restart. It is not responding to any IRC commands. I left you a message at wikitech regarding this as well. Regards, —MarcoAurelio 16:06, 27 November 2016 (UTC)

@MarcoAurelio: I have started the bot again this morning - it appeared to have crashed. I'll check a bit in a couple of hours. (sorry for the late reply, I have limited access to the servers at the moment). --Dirk Beetstra T C (en: U, T) 05:54, 1 December 2016 (UTC)
Found a lost 15 minutes. Made some code adaptations. Problem seems to be that the LinkSaver has serious work on some domains, and that stalls the bot. I have built in a bit of a catch to keep it a bit calmer. --Dirk Beetstra T C (en: U, T) 14:36, 1 December 2016 (UTC)
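The "catch to keep it a bit calmer" is not shown anywhere in this thread; one common shape for such a guard is a per-domain budget per time window, so a single busy domain cannot stall the whole saver. A hypothetical sketch (class name and limits are invented for illustration):

```python
import time
from collections import defaultdict

class DomainThrottle:
    """Hypothetical per-domain cap: refuse a domain's records once it has
    used up its budget inside the current window, so one heavy domain
    cannot monopolize the saver."""

    def __init__(self, max_per_window=50, window_seconds=60):
        self.max_per_window = max_per_window
        self.window_seconds = window_seconds
        self._counts = defaultdict(int)
        self._window_start = time.monotonic()

    def allow(self, domain):
        now = time.monotonic()
        # Reset all budgets when the window rolls over.
        if now - self._window_start >= self.window_seconds:
            self._counts.clear()
            self._window_start = now
        if self._counts[domain] >= self.max_per_window:
            return False  # defer this domain until the next window
        self._counts[domain] += 1
        return True
```

Deferred records would simply be retried in a later window, trading a little latency on the noisy domain for responsiveness everywhere else.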