Please do not post any new comments on this page. This is a discussion archive first created in May 2018, although the comments contained were likely posted before and after this date. See current discussion or the archives index.
Regex requested to be blacklisted: cafemom\.com/search
Regex requested to be blacklisted: gameinformer\.com/search
Both search links are being abused by spambots and appear alongside a variety of spam links. Neither site has a clear need to be added as a search link at sister wikis. — billinghurstsDrewth 23:15, 22 May 2018 (UTC)
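For context, the entries requested above are regular expressions matched against URLs added in edits. A minimal sketch of how such patterns behave (the two patterns are taken from the request above; the matching logic here is an illustrative assumption, not the actual SpamBlacklist extension code):

```python
import re

# Patterns from the request above. The real extension applies them to
# the URL portion of added external links; this sketch just searches
# the full URL string.
patterns = [r"cafemom\.com/search", r"gameinformer\.com/search"]
blacklist = re.compile("|".join(f"(?:{p})" for p in patterns))

def is_blacklisted(url: str) -> bool:
    """Return True if the URL matches a blacklisted pattern."""
    return blacklist.search(url) is not None
```

Under this sketch, `is_blacklisted("https://www.cafemom.com/search?q=x")` is true, while an ordinary article link on the same domain is not blocked, which is why search-only patterns like these are preferred over blacklisting a whole domain.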
Numerous longstanding links to Wikipedia's ancestor now connect to a domain-jacked webpage. Whether it is malicious or not, it is definitely dangerous to those who unsuspectingly click through. LeadSongDog (talk) 21:11, 1 May 2018 (UTC)
@LeadSongDog: This is an ineffective management process, as blacklisting does not fix or affect existing links in any way; it only affects additions. If additions aren't happening, then blacklisting is pointless. If there are bad/dangerous links, then they should be removed, or altered in line with the local wiki's rules. — billinghurstsDrewth 21:24, 1 May 2018 (UTC)
Ok, so where are the rules here on Meta? LeadSongDog (talk) 21:33, 1 May 2018 (UTC)
(note to self) I should add this to the request for the upgrade of the spam blacklist: a capability to disable existing links and/or an option to reinstate the capability that pages cannot be saved while an offending link is present (the latter enforcing the removal). --Dirk BeetstraTC (en: U, T) 05:25, 2 May 2018 (UTC)
If there's evidence that the domain is now malicious, I'd rather have it added here if it was a popular domain in the past, to avoid it being re-added. I also agree with Beetstra that the SBL should deactivate existing external links to blacklisted domains where the link has not yet been removed from the page. —MarcoAurelio (talk) 14:17, 2 May 2018 (UTC)
@MarcoAurelio and Beetstra: Such suggestions directly contravene the current development of the blacklist and its tactical operation. It would also negate additions of urls that have been undertaken with temporary whitelisting [it would mean that sites would have to permanently whitelist all sorts of urls]. I don't see that as a functional plan: the downsides of not being able to save pages, and of forcing the bulk removal of urls, would be horrendous. We also don't have a good means of testing additions to the blacklist, so any erroneous addition could be catastrophic in what it forces people to do in order to save a page, rather than just blocking new additions. If we have problematic urls, blacklisting all pages with old urls is a sledgehammer to crack a walnut. There are more finessed means of resolving such a problem. If you want link removal/deactivation, that sounds like a job for a bot.
Re malicious urls, we monitor and if they are being added then we blacklist (situation normal). We have those tools already. — billinghurstsDrewth 22:25, 2 May 2018 (UTC)
@Billinghurst: with certain urls we would like to enforce a cleanup. Yes, it would annoy editors that they have to remove rubbish, but as we are talking about an AbuseFilter-like system, custom messaging is an option. It is also not a function to be used lightly, only for properly malicious material. Note that I have been suggesting a much, much more surgical system than the crude system of now. —Dirk BeetstraTC (en: U, T) 03:39, 3 May 2018 (UTC)
We can write a global abuse filter now where we load "dangerous" urls (with limitations on the "global" part, however) and alert users to remove them. The issue is the direction we give people: is it to remove the link, to find an archived version of the old url, or to null it? For Commons urls, we don't want to obliterate them. For references in articles, there is the requirement to keep them; for Wikidata, there is the requirement to deprecate rather than delete. One size fits all is going to be difficult. — billinghurstsDrewth 06:23, 3 May 2018 (UTC)
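The filter idea above can be sketched roughly as follows. This is a hedged illustration in Python rather than actual AbuseFilter rule syntax; the domain names are hypothetical placeholders, and a real filter would carry a custom warning message directing the editor to remove, archive, or null the link:

```python
import re

# Hypothetical "dangerous" domains loaded into the filter; in a real
# global abuse filter these would be maintained in the filter's
# conditions, not hard-coded.
DANGEROUS = [r"hijacked-example\.com", r"malware-example\.net"]
dangerous_re = re.compile("|".join(DANGEROUS))

def flag_added_lines(added_lines):
    """Return the added lines containing a dangerous URL, so the
    editor can be alerted to remove or replace the link before saving."""
    return [line for line in added_lines if dangerous_re.search(line)]
```

For example, `flag_added_lines(["see http://hijacked-example.com/x", "harmless text"])` would return only the first line, which is the edit content the custom message would point at.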
The last point has been anticipated in my suggested rewrite of the spam blacklist extension. For references in articles, the links should be replaced with a working archive link (to a past state of the site), or else replaced with an equivalent reference (and otherwise, they become unreferenced). For references, the external link is by no means needed anyway; a proper reference can stand on its own without the external link ("Van Meter, K.J.; Van Cappellen, P.; Basu, N.B.; Science, Vol. 360, Issue 6387, pp. 427-430" is a perfectly valid reference in an article). Regarding Wikidata, yes, we want to obliterate them as well, as they are transcluded on other wikis and extend the problem (or are not, and then the transcluding page with the official website cannot be saved), and they are damaging links in the first place (for those who go to Wikidata and want to follow the links presented there). Damaging links should be removed without exception (and probably more so in references than anywhere else; I follow more references than external links in documents). --Dirk BeetstraTC (en: U, T) 07:08, 3 May 2018 (UTC)
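The "replace with a working archive link" suggestion above amounts to rewriting the dead URL into a Wayback Machine URL pointing at a past state of the site. A minimal sketch (the timestamp default is an assumption; a real bot would query the archive for the nearest available snapshot rather than hard-code one):

```python
def to_archive_link(url: str, timestamp: str = "2018") -> str:
    """Rewrite a (hijacked or dead) URL into a Wayback Machine link
    to a past state of the site, per the suggestion above."""
    return f"https://web.archive.org/web/{timestamp}/{url}"
```

So a reference pointing at `http://example.com/page` would become `https://web.archive.org/web/2018/http://example.com/page`, keeping the reference verifiable without sending readers to the hijacked domain.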