Seems COIBot account is not AUTHing for freenode

Seeing both COIBot and XLinkBot come and go in IRC. Tried lots of things, including a reboot, without success. Zppix believes that the account is not AUTHing, and I have no idea how to resolve that aspect, nor even how to troubleshoot or check whether that is the case, so I am handing that bit back to you. Happy to be further educated on what I should be doing. Regards arms and legs  — billinghurst sDrewth 23:53, 26 February 2021 (UTC)
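
If it helps with the "how do I even check" part: from outside the bot, a WHOIS of the nick shows whether it is identified, because freenode only sends numeric 330 ("is logged in as") for nicks that have AUTHed to NickServ. Below is a minimal Python sketch of that probe; it is not COIBot's own code, and the server, port and probe nick are placeholder assumptions.

```python
# Minimal sketch (not COIBot's code): connect, WHOIS the bot's nick, and look
# for numeric 330 ("is logged in as"), which freenode sends only when the
# nick is identified to NickServ. Server/port/probe nick are placeholders.
import socket

SERVER, PORT = "chat.freenode.net", 6667   # assumed connection details
PROBE_NICK = "authprobe"                   # throwaway nick for this check
TARGET = "COIBot"                          # nick whose AUTH status we want

sock = socket.create_connection((SERVER, PORT))
sock.sendall(f"NICK {PROBE_NICK}\r\nUSER {PROBE_NICK} 0 * :auth probe\r\n".encode())

buf, whois_sent = "", False
while True:
    data = sock.recv(4096)
    if not data:
        break                              # connection closed
    buf += data.decode(errors="replace")
    *lines, buf = buf.split("\r\n")        # keep any partial line for next read
    for line in lines:
        if line.startswith("PING"):
            sock.sendall(("PONG" + line[4:] + "\r\n").encode())
            continue
        parts = line.split()
        if len(parts) < 2:
            continue
        if parts[1] == "001" and not whois_sent:       # registration done
            sock.sendall(f"WHOIS {TARGET}\r\n".encode())
            whois_sent = True
        elif parts[1] == "330" and len(parts) > 4:     # <target> is logged in as <account>
            print(f"{TARGET} is identified as {parts[4]}")
        elif parts[1] == "318":                        # end of WHOIS
            print("End of WHOIS; if no 330 was printed, the nick has not AUTHed")
            sock.sendall(b"QUIT :done\r\n")
            raise SystemExit
```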

Something worked automagically at some point. No idea why, and it was after the LiWa3s had disappeared. All back and functioning.  — billinghurst sDrewth 04:34, 27 February 2021 (UTC)

Sigh, bots have issues. Needs a break? COVID-19? Maybe I will have a bit of time at the end of this week to keep an eye on it and see if I can figure out what the issue is. --Dirk Beetstra T C (en: U, T) 05:40, 28 February 2021 (UTC)

bd808 asks ...

Whilst I was seeking some help today, BDavis asked two things re COIBot. I can do the second, but not the first due to access.

<bd808> it would probably also be a good idea to clean up some/all of tools.coibot and add a README there saying that it is all on another Cloud VPS project now

and I went to do it and found that I don't have access to the tool's group on Toolforge to do it.

<bd808> billinghurst: write some system admin procedure docs for this project pretty please. :)
...
<bd808> billinghurst: the most basic info would be something like https://wikitech.wikimedia.org/wiki/Tool:Stashbot#Maintenance

which I will do.  — billinghurst sDrewth 11:32, 27 February 2021 (UTC)

@Billinghurst: you should get access to tools.coibot ... if only I knew how to give you that. --Dirk Beetstra T C (en: U, T) 05:42, 28 February 2021 (UTC)
https://toolsadmin.wikimedia.org/tools/id/coibot: log in there and you can maintain the maintainers.  — billinghurst sDrewth 11:23, 28 February 2021 (UTC)

FYI: Racking up files

Seems that there is some DB-type issue, as linkwatcher is racking up link files and not processing a lot of links. I have been through both a restart of linkwatcher and a reboot of the liwa3 instance, though the backlog keeps slowly increasing with only occasional file loading. I also did a hard restart of coibot (though no reboot), just in case.

<sDrewth> !backlog
<COIBot> No reports waiting. On Wiki: 0 open XWiki reports and 30 open Local reports.
<COIBot> Syslogs: 20: - coibot: -1 secs. commander: 0 secs. diffreader: 14658 secs. linksaver: 8 secs. parser: 4093 secs. readbl: - script: -1 secs.
<LiWa3_2> Syslogs: 20: - diffreader: 14 secs. linkanalyser: 697 secs. linkparser: 2688 secs. linkreporter: 12365 secs. linkwatcher: 125 secs. output: - script: -1 secs.
<LiWa3_3> LW: 03 hours 26:08 minutes active; RC: last 1 sec. ago; Reading ~ 864 wikis; Queues: P1=222; P2=1687; P3=90 (101 / 399); A1=1111; A2=0 (1194 / 868); M=0 - In LinkWatcher edit-backlog: 4094 files (0 lines) and in analyser backlog: 8 files (0 lines).

 — billinghurst sDrewth 02:46, 7 March 2021 (UTC)
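
For watching whether that backlog is actually going down (instead of eyeballing the channel), the numbers can be pulled out of the status line LiWa3 prints. A small sketch follows; the regexes are based only on the line quoted above, so treat the exact format as an assumption.

```python
# Sketch: extract queue sizes and backlog file counts from a LiWa3 status
# line. The field names/format are inferred from one sample line only.
import re

STATUS = ("LW: 03 hours 26:08 minutes active; RC: last 1 sec. ago; "
          "Reading ~ 864 wikis; Queues: P1=222; P2=1687; P3=90 (101 / 399); "
          "A1=1111; A2=0 (1194 / 868); M=0 - In LinkWatcher edit-backlog: "
          "4094 files (0 lines) and in analyser backlog: 8 files (0 lines).")

def parse_backlog(status: str) -> dict:
    """Return queue sizes and backlog file counts from a LiWa3 status line."""
    queues = {k: int(v) for k, v in re.findall(r"([PAM]\d?)=(\d+)", status)}
    edit = re.search(r"edit-backlog: (\d+) files", status)
    analyser = re.search(r"analyser backlog: (\d+) files", status)
    return {
        "queues": queues,
        "edit_backlog_files": int(edit.group(1)) if edit else None,
        "analyser_backlog_files": int(analyser.group(1)) if analyser else None,
    }

print(parse_backlog(STATUS))
# {'queues': {'P1': 222, 'P2': 1687, 'P3': 90, 'A1': 1111, 'A2': 0, 'M': 0},
#  'edit_backlog_files': 4094, 'analyser_backlog_files': 8}
```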

@Billinghurst: syslog.linkparser throws strange errors - it seems that somewhere the system changed and it misfires on regexes, which then results in mis-assigned edits. It looks like all edits are on 'Mw::' (as if it does not read the diffurl well), and it then tries to read diffs from en.wikipedia that were actually somewhere else. I don't understand yet what the issue is (whether DiffReader.pl misinterprets stuff, or whether linkwatcher.pl is doing something with the data). Note that things are also wrong in the backlog files, so it goes wrong between DiffReader.pl reading the diff from the feed and linkwatcher.pl storing it (i.e. before it hits LinkParser.pl). --Dirk Beetstra T C (en: U, T) 06:01, 7 March 2021 (UTC)
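
To illustrate the failure mode described above (everything landing on 'Mw::' because the diff URL no longer parses), here is a rough Python re-creation of that kind of URL-to-wiki mapping. The real code is Perl in DiffReader.pl / linkwatcher.pl; the regex and the fallback value are assumptions made for the example only.

```python
# Illustrative only: map a diff URL to a wiki key, falling back to a generic
# "Mw::" bucket when the URL does not match the expected shape. If the feed
# changes the URL format, every edit ends up in the fallback bucket.
import re

DIFF_RE = re.compile(
    r"https?://(?P<lang>[a-z\-]+)\.(?P<project>wikipedia|wiktionary|wikibooks|"
    r"wikinews|wikiquote|wikisource|wikiversity|wikivoyage)\.org/"
)

def wiki_from_diffurl(diffurl: str) -> str:
    """Return a 'lang::project' key for a diff URL, or 'Mw::' if unrecognised."""
    m = DIFF_RE.match(diffurl)
    return f"{m.group('lang')}::{m.group('project')}" if m else "Mw::"

print(wiki_from_diffurl("https://nl.wikipedia.org/w/index.php?diff=123&oldid=122"))
# nl::wikipedia
print(wiki_from_diffurl("https://www.wikidata.org/w/index.php?diff=456"))
# Mw::  (a URL shape the pattern does not know about)
```

If the feed started handing over diff URLs in a shape the pattern does not recognise, every edit would fall into the fallback bucket, which matches the symptom of diffs being read against the wrong wiki.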

@Billinghurst: is there an extremely slow or throttled server somewhere, so that the bots lag because they have to wait for info/responses? —Dirk Beetstra T C (en: U, T) 15:06, 7 March 2021 (UTC)

User:Sic19 thwocking WD with URLs

I can see that User:Sic19 is racking up edits at WD adding official websites, up to about 10M edits (unsure how many are recent). I have undertaken wl add Sic19 * and I hope that is the right solution. If there is something better that I can do, then please let me know.  — billinghurst sDrewth 23:33, 13 March 2021 (UTC)

@Billinghurst: yes, I guess that is it. Flooders on WD are an issue. Not parsing them and not getting the data is not really an option either (it sets a record for official websites; if I have time I could program something to remove 'official site to subject' from the stats and become more precise). Crap ...
We should be whitelisting / do-not-count-ing more links on WD though. It would be great if such flooders could inform linkwatcher and coibot beforehand ... —Dirk Beetstra T C (en: U, T) 05:50, 14 March 2021 (UTC)
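
For what wl add Sic19 * presumably amounts to in practice, here is a rough sketch of a user/link whitelist check with wildcard support: edits from a whitelisted user are still parsed, but their links are excluded from the counting. The data structure and matching rules are illustrative assumptions, not COIBot's or LinkWatcher's own.

```python
# Sketch of a whitelist check: (user, link pattern) pairs, where "*" means
# "any link". Whitelisted pairs are skipped when building link statistics.
import fnmatch

WHITELIST = [
    ("Sic19", "*"),   # flood of official-website additions on Wikidata
]

def is_whitelisted(user: str, link: str) -> bool:
    """True if this user/link pair should be excluded from link statistics."""
    return any(
        user == wl_user and fnmatch.fnmatch(link, wl_pattern)
        for wl_user, wl_pattern in WHITELIST
    )

print(is_whitelisted("Sic19", "https://example.org/"))        # True: don't count
print(is_whitelisted("SomeSpammer", "https://example.org/"))  # False: count it
```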

Can COIBot/LinkReports be generated manually?

It sometimes happens that I'm not sure whether there is enough evidence to report a domain at en:Wikipedia talk:WikiProject Spam. It would be nice if I could generate a preview of the domain's LinkReport without it being saved. This would tell me if the domain is worth reporting. —Bruce1ee talk 09:12, 5 April 2021 (UTC)

@Bruce1ee: I have granted you access to request reports at user:COIBot/Poke. Or, if you use IRC, connect to Freenode channel #wikimedia-external-links, where you can request reports and run some analytics (see Small Wiki Monitoring Team/IRC).  — billinghurst sDrewth 13:53, 5 April 2021 (UTC)
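
If poking by hand gets repetitive, the request is in the end just an extra line on user:COIBot/Poke, so it can also be scripted. Below is a minimal pywikibot sketch; the line format the bot expects is an assumption here, so mirror whatever the existing entries on the page look like.

```python
# Sketch only: append a report request to user:COIBot/Poke via pywikibot.
# The "* domain reason ~~~~" line format is assumed, not confirmed.
import pywikibot

def poke_domain(domain: str, reason: str) -> None:
    """Append a LinkReport request for `domain` to User:COIBot/Poke on Meta."""
    site = pywikibot.Site("meta", "meta")
    page = pywikibot.Page(site, "User:COIBot/Poke")
    page.text = page.text.rstrip() + f"\n* {domain} {reason} ~~~~"
    page.save(summary=f"Requesting LinkReport for {domain}")

poke_domain("example.org", "possible spam, not sure it warrants a WikiProject Spam report yet")
```
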
Thank you billinghurst, that will help a lot. I'll be using COIBot/Poke as I don't use IRC. —Bruce1ee talk 14:00, 5 April 2021 (UTC)
@Bruce1ee and Billinghurst: I was planning earlier today to check whether you had that capability. Done. --Dirk Beetstra T C (en: U, T) 14:16, 5 April 2021 (UTC)
Thanks. —Bruce1ee talk 14:31, 5 April 2021 (UTC)