User talk:Beetstra/Archives 2014


COIBot restarting "Selected additions" rather than appending

Gday, I am not sure whether it is intentional, an error, or a consequence of the database refresh; however, the xwiki reports that are being generated are losing their "Selected additions" history section: it all starts afresh from the recovery of the system. It is that middle section only, so it is survivable, just not ideal.  — billinghurst sDrewth 03:25, 4 January 2014 (UTC)

Yep, the old data is gone, and COIBot uses everything it has in the db. Just be aware that there may be data in the history of files. For the older stuff, one will have to dig 'manually' .. --Dirk Beetstra T C (en: U, T) 13:47, 4 January 2014 (UTC)

LinkWatchers' turn to slack off

and they haven't come back. FWIW, I have gotten round to getting my tools account active so that you can train me a few more steps to look after the mundane.  — billinghurst sDrewth 13:40, 28 January 2014 (UTC)

Coren killed it, partly for something that I still had to solve, and partly for something that I don't understand (the bots have been doing that for months without issues ..). I've solved the first part and started the bot again. It does give some additional problems that I have to solve at some point. One thing at a time. --Dirk Beetstra T C (en: U, T) 07:53, 29 January 2014 (UTC)
Totally understand, and in truth I am not looking for more work, just finally getting some things done, and letting you know.  — billinghurst sDrewth 09:52, 29 January 2014 (UTC)

COIBot stopped, presuming that it didn't migrate

Coren did the migration thingy for those bots that hadn't been moved by their owners (my) overnight. At or about that time, all the bots stopped working, so I am presuming that it didn't get moved by you previously. Anyway, a note for whenever you are about.  — billinghurst sDrewth 23:35, 17 March 2014 (UTC)

Sigh. What did they change this time? I am a volunteer, not a full-time worker paid by them. The bots were running under their own accounts, but again the servers needed to be changed. I am starting to wish for my own box again :-(. --Dirk Beetstra T C (en: U, T) 07:19, 18 March 2014 (UTC)
They moved from pmtpa to eqiad. All jobs were stopped and crontabs were removed. Can you restart COIBot please? Do you just need to run start.sh? PiRSquared17 (talk) 03:45, 28 March 2014 (UTC)
No, the issue was (is?) that the tools db was (is?) down. Beetstra needed to get Coren to do bits, and we are waiting.  — billinghurst sDrewth 10:44, 28 March 2014 (UTC)
Indeed, I do not have access to the db :-( .. no clue where it is now. --Dirk Beetstra T C (en: U, T) 08:53, 31 March 2014 (UTC)

COIBot not writing to meta

Gday Beetstra. COIBot apparently has stopped writing meta reports. They appear to be running fine, and are being reported as running through IRC; however, when it comes time to write the report (and IRC says that it is being saved), there is no output. Thanks.  — billinghurst sDrewth 12:31, 4 June 2014 (UTC)

And again :-/  — billinghurst sDrewth 12:12, 6 June 2014 (UTC)
Grr .. it gets logged out every now and then, even though the last session should last into July. No clue why. --Dirk Beetstra T C (en: U, T) 05:14, 8 June 2014 (UTC)
If there is a means for me to kick it from outside, please let me know.  — billinghurst sDrewth 15:09, 8 June 2014 (UTC)
Oh, and thanks for inside kicking  — billinghurst sDrewth 15:10, 8 June 2014 (UTC)

COIBot lost its write ability

It has stopped writing to Meta again. Thanks for whatever you can do, whenever you can do it. If this is a simple process to restart, maybe it is time for me to get to kick it.  — billinghurst sDrewth 01:56, 10 July 2014 (UTC)

No, the process is to log it in again .. I need to work on that at some point to make it easier. --Dirk Beetstra T C (en: U, T) 04:04, 10 July 2014 (UTC)

COIBot complaining about a lookaround regex

Gday. A note for when you next tickle the code of COIBot: it was complaining about the regex (?<!-)\bt\.co\b, which from my checks seems legitimate.

COIBot> ALERT: http://meta.wikimedia.org/w/index.php?diff=9283437&oldid=9283384&rcid=5448608: Faulty regex ((?<!-)\bt\.co\b) inserted (Unmatched ) in regex; marked by <-- HERE in m/(?<!-) <-- HERE \bt\.co\b/ at coibot.pl line 2968, <PIDS> line 103.

Thanks.  — billinghurst sDrewth 12:53, 23 July 2014 (UTC)

Funny - I think there are problems between the Perl and PHP flavours of regexes. I'll have a look. I do think it is correct as well; it appears that it misses the opening (. --Dirk Beetstra T C (en: U, T) 07:41, 3 August 2014 (UTC)
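For the record, the rule as written does compile in plain Perl, which supports the reading that something upstream is mangling it. Below is a minimal sketch (not the actual coibot.pl code; compile_rule is a hypothetical helper) of compiling a stored pattern inside eval before it is used, so a broken rule is logged rather than crashing the matching code:

 use strict;
 use warnings;
 
 # Sketch only: try to compile a stored pattern and report failures.
 sub compile_rule {
     my ($pattern) = @_;
     my $re = eval { qr/$pattern/ };
     if ($@) {
         warn "Faulty regex ($pattern): $@";
         return undef;
     }
     return $re;
 }
 
 # The reported rule compiles fine on its own:
 print "ok\n" if compile_rule('(?<!-)\bt\.co\b');
 
 # ... but fails to compile if the opening ( is lost somewhere before compilation:
 print "broken\n" unless compile_rule('?<!-)\bt\.co\b');

Wrapping qr// in eval is the usual way to trap a pattern that does not compile; what exactly strips the opening parenthesis inside COIBot remains the open question here.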

Database access to linkwatcher_linklog for MerlBot

Hi, on dewiki I am running a bot called MerlBot that informs local wikiprojects about articles related to their topic that need attention (deletion requests, syntax problems and many more). As a new feature I would like to inform projects about current spam-blacklist discussions, locally on dewiki or on Meta. For this I need a list of dewiki articles that are affected by black/whitelisting a domain. Of course I am using the externallinks database tables, but I also need to know which articles had this domain added in the past.

For this I tried to query your bot on IRC, but this is very slow and has problems with big result sets. I also looked at your source code and found a lot of overhead that is done on an IRC request but that I do not need. Also, I only need the most recent additions, which speeds up the query execution time by adding a result limit. In your source code I also found your db password (sorry), and so I looked at your table content. I did some tests, and direct access to your database tables would really fit my needs. So I would like to ask if you would grant me access to your tables.

My bot would query your database four times a day for all domains currently in a spam-blacklist discussion. In my tests all the queries finished in less than one minute, even for amazon.com and facebook.com, which are currently being discussed on dewiki. So it should not cause problems for your scripts by blocking or putting heavy load on your tables. My tool account for reading external data is merlbot-read, with db user name "s51826", so you could grant me access using this command:

GRANT SELECT ON `s51230__linkwatcher`.`linkwatcher_linklog` TO 's51826'@'%'

Merlissimo (talk) 23:27, 30 July 2014 (UTC)

Hi. I think that is fine - I will try to do that when I can access the server. Alternatively, COIBot could follow and write the reports to the German Wiki as well, like it does for en.wikipedia. It would only need a bot-bit and use of the proper templates on the relevant discussion pages (it picks up the additions of '{{LinkSummary|<domain>}}' from certain pages on meta and en, extracts the domain and saves a report for it). --Dirk Beetstra T C (en: U, T) 07:38, 3 August 2014 (UTC)
Merlissimo - enabling COIBot on de.wikipedia would also save all 'local reports for possible spamming' on that wiki (basically, it would populate the tree under Category:COIBot Local Reports for de.wikipedia.org on the local wiki, as well as a tree for all cross-wiki reports, which may already, or in the future, affect de.wikipedia as well). On en.wikipedia COIBot does the same, see en:Category:Local_COIBot_Reports (where the tree is mainly ignored ..). --Dirk Beetstra T C (en: U, T) 05:13, 5 August 2014 (UTC)
And the reports can be written in 'the local language' to a certain extent, through the settings and the templates it calls. --Dirk Beetstra T C (en: U, T) 05:14, 5 August 2014 (UTC)
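As an illustration of the '{{LinkSummary|<domain>}}' pickup mentioned above, here is a minimal sketch (assuming the templates sit in plain wikitext; this is not COIBot's actual parser) of extracting the domains from a page:

 use strict;
 use warnings;
 
 # Pull the domain out of every {{LinkSummary|<domain>}} occurrence on a page.
 my $wikitext = "* {{LinkSummary|example.com}}\n* {{LinkSummary|t.co}}\n";
 my @domains  = $wikitext =~ /\{\{\s*LinkSummary\s*\|\s*([^|}]+?)\s*\}\}/gi;
 print "$_\n" for @domains;    # example.com, t.co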
COIBot reports are not an alternative to what I need for my bot, because they are not updated daily.
So COIBot reports on dewiki could only be an additional feature. I already talked with Lustiger seth about having these reports locally. I have no problem with running COIBot on dewiki, but we both don't know whether there would be a real benefit. There are only very few admins on dewiki maintaining the local spam-blacklist; seth is doing 80% of this work alone. And they all know how to use COIBot on IRC. Merlissimo (talk) 10:37, 5 August 2014 (UTC)
The reports would be updated when the additions continue, and on a regular basis. But it is fine. I hope I remember tomorrow to give you access. Do you need structured output as well? --Dirk Beetstra T C (en: U, T) 12:16, 5 August 2014 (UTC)
The following are the only queries I currently would like to execute; the first (used as a subquery) in the case of plain domains, the second for URLs including a path:

select distinct namespace, pagename, revid from s51230__linkwatcher.linkwatcher_linklog l where lang='de' and wikidomain='w' and l.domain = ? order by id desc limit 50

SELECT DISTINCT ns_id, REPLACE(pagename,' ','_') as page_title, revid from s51230__linkwatcher.linkwatcher_linklog l inner join s51892_toolserverdb_p.namespacename n on ns_name=namespace where lang='de' and wikidomain='w' and l.domain = ? AND (fullurl like ? OR fullurl like ? OR fullurl like ? OR fullurl like ?) AND n.dbname='dewiki_p' AND ns_id IN (0,1,4,100,101) order by id desc limit 50

Merlissimo (talk) 23:09, 5 August 2014 (UTC)
I've granted the access. You're aware that my 'domain' is 'sorted' backwards? 'www.somewhere.com' is stored as 'com.somewhere.www.' to increase search-speed? --Dirk Beetstra T C (en: U, T) 03:21, 6 August 2014 (UTC)
Of course. And also with "www" removed. fullurl is stored as normal; that is why, in the case of the second version, which includes a path check, I have four fullurl checks, prepended with "http://", "https://", "http://www." and "https://www.". On the dewiki_p.externallinks table I am querying el_index, which is the reversed domain plus the normal path. Thank you for granting access. I have already tested it and it works fine. Merlissimo (talk) 08:32, 6 August 2014 (UTC)
I also took off 'www2.' and/or 'www3.'. Good to hear it all works, curious to see some output at some point (maybe it is something interesting I can also make COIBot work on). --Dirk Beetstra T C (en: U, T) 08:56, 6 August 2014 (UTC)
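For illustration, a sketch of the storage convention described in this thread: the host labels are reversed and, per the notes above, a leading www/www2/www3 label is dropped (index_domain is a made-up name, and the exact normalisation in the linkwatcher code may differ):

 use strict;
 use warnings;
 
 # Reverse the labels of a host and drop a leading www/www2/www3 label,
 # mirroring how 'domain' is described as being stored in linkwatcher_linklog.
 sub index_domain {
     my ($host) = @_;
     $host = lc $host;
     $host =~ s/^www\d*\.//;    # strip 'www.', 'www2.', 'www3.', ...
     return join('.', reverse(split /\./, $host)) . '.';
 }
 
 print index_domain('www.somewhere.com'), "\n";    # com.somewhere.
 print index_domain('www2.example.de'), "\n";      # de.example.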
Have a look at the section marked with   at e.g. de:Wikipedia:Redaktion_Chemie/Infotafel2 or de:Wikipedia:Redaktion_Medizin/Qualitätssicherung and 280 more pages.
These wikiprojects are informed by my bot about ongoing discussions about articles belonging to their subject area. Now they are also informed about spam-blacklist/whitelist discussions affecting at least one article of their subject area. Without your tool my bot wouldn't know about articles whose links were added in the past but already reverted.
E.g. a link is added to the article "tennis"; this is reverted and a blacklist discussion is started. With the help of your data my bot now knows that this link was added to the article tennis in the past, and can inform the wikiproject "sports" about this blacklist discussion. Merlissimo (talk) 09:20, 6 August 2014 (UTC)

From a pub in Itchenor

... waiting for the rain to stop so I can continue my walk. I see that COIBot is not in the channel #wikimedia-external-links; would you mind giving it a kick when you have a chance? Thanks.  — billinghurst sDrewth 14:04, 13 August 2014 (UTC)

Pff .. seems to be dead. I'll try to have a look this afternoon. Enjoy the beer! --Dirk Beetstra T C (en: U, T) 05:04, 14 August 2014 (UTC)
It was dead, old problem (it gets logged out, dies, and tries to restart which fails when it is not logged in). I really should work on that code a bit, time allowing. --Dirk Beetstra T C (en: U, T) 11:20, 14 August 2014 (UTC)
Again. She goes kaput, and does not return.  — billinghurst sDrewth 13:06, 14 September 2014 (UTC)
You're still in that pub.
Yeah, COIBot has login problems; it sometimes cannot get the right data and then crashes on it. I don't really understand why. --Dirk Beetstra T C (en: U, T) 03:19, 15 September 2014 (UTC)

Superprotect letter update

Hi Beetstra,

Along with many hundreds of others, you recently signed Letter to Wikimedia Foundation: Superprotect and Media Viewer, which I wrote.

Today, we have 562 signatures here on Meta, and another 61 on change.org, for a total of 623 signatures. Volunteers have fully translated it into 16 languages, and begun other translations. This far exceeds my most optimistic hopes about how many might sign the letter -- I would have been pleased to gain 200 signatures -- but new signatures continue to come.

I believe this is a significant moment for Wikimedia and Wikipedia. Very rarely have I seen large numbers of people from multiple language and project communities speak with a unified voice. As I understand it, we are unified in a desire for the Wikimedia Foundation to respect -- in actions, in addition to words -- the will of the community who has built the Wikimedia projects for the benefit of all humanity. I strongly believe it is possible to innovate and improve our software tools, together with the Wikimedia Foundation. But substantial changes are necessary in order for us to work together smoothly and productively. I believe this letter identifies important actions that will strongly support those changes.

Have you been discussing these issues in your local community? If so, I think we would all appreciate an update (on the letter's talk page) about how those discussions have gone, and what people are saying. If not, please be bold and start a discussion on your Village Pump, or in any other venue your project uses -- and then leave a summary of what kind of response you get on the letter's talk page.

Finally, what do you think is the right time, and the right way, to deliver this letter? We could set a date, or establish a threshold of signatures. I have some ideas, but am open to suggestions.

Thank you for your engagement on this issue, and please stay in touch. -Pete F (talk) 18:14, 26 August 2014 (UTC)

Link watchers are out of action; a restart doesn't make a difference, nor is there any response when it is told.  — billinghurst sDrewth 00:32, 5 October 2014 (UTC)

COIBot lost connection with meta

Same old.  — billinghurst sDrewth 08:59, 20 October 2014 (UTC)

Friggin losing its login all the time. It should be logged in for a month, but sometimes it .. decides to lose it. XLinkBot has the same issue. I'll monitor, regularly .. --Dirk Beetstra T C (en: U, T) 07:17, 21 October 2014 (UTC)
Argh, logged it in this morning, again dead .. :-( I'll look again tomorrow morning. Frustrating. --Dirk Beetstra T C (en: U, T) 12:30, 21 October 2014 (UTC)
You'll never guess. hmmm or maybe you'll never guess errr … guess what?!?  — billinghurst sDrewth 00:15, 2 November 2014 (UTC)
Well, it is back, by intervention or by time passing. However, we now have XLinkBot complaining: "XLinkBot> I've become logged-out on Wikipedia, reverting disabled" ... and I don't know if there is a means to kick it. If there is, I didn't find it.  — billinghurst sDrewth 12:59, 4 November 2014 (UTC)
COIBot seems finally stable for more than 24 hours .. I logged XLinkBot back in just now.
I am starting to think that there is something wrong on Labs - why do the bots get logged out so much? COIBot is the worst, but XLinkBot is also regularly gone (XLinkBot has been running since March 18 now .. the bot itself is darn stable; that is another issue with COIBot, which restarts regularly). I really need some time (which I do not have) to monitor the bots. --Dirk Beetstra T C (en: U, T) 03:18, 5 November 2014 (UTC)
I am wondering whether @MPelletier (WMF): can offer a suggestion.  — billinghurst sDrewth 05:19, 5 November 2014 (UTC)

YGM

— revimsg 09:23, 21 October 2014 (UTC)
