Talk:Spam blacklist/Archives/2015-09


Proposed additions

This section is for completed requests that a website be blacklisted.

placidway.com



See spam report. Site deals with medical tourism and has no value for Wikimedia projects. MER-C (talk) 11:40, 13 September 2015 (UTC)

Added --Herby talk thyme 11:57, 13 September 2015 (UTC)

Proposed removals

This section is for archiving proposals that a website be unlisted.

jerseyusa.net



Collateral damage from some of the Chinese knockoff regexes -- whitelisting request. I'd rather have this addressed here. MER-C (talk) 12:29, 27 January 2015 (UTC)

The rule is 'jerseys?(mvp|-)?(nba|shops?|goods|whole|wholesale|soho|release|zones|sale|com|pick|cn|export|supply|trade|site|warehouse|stop|faves|4u|kk|cc|ab|usa|outlets?|clubhouse|only|buy|planet|911)\.(com|us|org|net)\b' (my bolding) - not sure how to exclude jerseyusa.net from such a complicated rule - except if we split it into 'jerseys?(mvp|-)?(nba|shops?|goods|whole|wholesale|soho|release|zones|sale|com|pick|cn|export|supply|trade|site|warehouse|stop|faves|4u|kk|cc|ab|outlets?|clubhouse|only|buy|planet|911)\.(com|us|org|net)\b' and 'jerseys?(mvp|-)?(nba|shops?|goods|whole|wholesale|soho|release|zones|sale|com|pick|cn|export|supply|trade|site|warehouse|stop|faves|4u|kk|cc|ab|usa|outlets?|clubhouse|only|buy|planet|911)\.(com|us|org)\b'. --Dirk Beetstra T C (en: U, T) 08:48, 28 January 2015 (UTC)
The whole thing gets regex'd anyway when the blacklist is applied, so just pull it out and do it separately.  — billinghurst sDrewth 05:37, 1 February 2015 (UTC)
Removed addition  — billinghurst sDrewth 05:20, 8 September 2015 (UTC)
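The split suggested above can be sanity-checked outside the wiki. Below is a minimal Python sketch (the alternative lists are shortened for readability; the real rule is much longer) showing that the combined rule catches jerseyusa.net as collateral damage, while the two split rules — one dropping "usa" from the middle group, one dropping "net" from the TLD group — let it through but still block the knockoff domains:

```python
import re

# Simplified version of the combined knockoff-jersey rule discussed above.
combined = re.compile(r'jerseys?(mvp|-)?(nba|shops?|usa|outlets?)\.(com|us|org|net)\b')

# Split variant 1: all TLDs kept, but "usa" removed from the middle group.
no_usa = re.compile(r'jerseys?(mvp|-)?(nba|shops?|outlets?)\.(com|us|org|net)\b')
# Split variant 2: "usa" kept, but ".net" removed from the TLD group.
no_net = re.compile(r'jerseys?(mvp|-)?(nba|shops?|usa|outlets?)\.(com|us|org)\b')

def blocked_by_split(url):
    """A URL is blocked if either split rule matches it."""
    return bool(no_usa.search(url) or no_net.search(url))

print(bool(combined.search('jerseyusa.net')))   # True  - collateral damage
print(blocked_by_split('jerseyusa.net'))        # False - now allowed
print(blocked_by_split('jerseysusa.com'))       # True  - still blocked
print(blocked_by_split('jerseynba.net'))        # True  - still blocked
```

Only the one domain escaping both split rules changes behaviour; every other combination of the middle group and TLD group is still covered by at least one of the two rules.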

misericords.co.uk



Seems like an inoffensive, relatively scholarly site. I'd like to use it for the Peterborough Cathedral article. --Jtle515 (talk) 22:26, 8 August 2015 (UTC)

It was spammed, and if you look at the XWiki link, you will see the history. I don't have a particular issue removing it, as it is an old block.  — billinghurst sDrewth 04:40, 9 August 2015 (UTC)
Removed  — billinghurst sDrewth 05:11, 8 September 2015 (UTC)

Bergspider.net

fluoridealert.org



This site offers good independent information about fluorine and fluoride additives to water supplies. It was blacklisted 5 years ago and the listing was not logged properly. I do not have any conflict of interest. It does not contain spam. It supplies unbiased references and primary sources for information not otherwise obtainable in English-speaking countries. Please remove it from the blacklist, thank you. — The preceding unsigned comment was added by RibsNY (talk)

Not here; not done. Please see the local blacklist on the wiki where you are coming from. --Dirk Beetstra T C (en: U, T) 10:01, 16 August 2015 (UTC)
  Declined  — billinghurst sDrewth 05:14, 8 September 2015 (UTC)

squidoo.com



We acquired the Squidoo.com domain over a year ago and it's no longer hosting any content. We have aggressively cleaned up remnants such as this and would appreciate it being removed from the blacklist. We will ensure that activities with this domain aren't associated with spamming in the future. I work for the corporation that now owns the domain. — The preceding unsigned comment was added by Pauledmondson (talk)

It turns out the new owner is... Hubpages, which is blacklisted on en.wikipedia for exactly the same reason Squidoo is blacklisted here. MER-C (talk) 02:47, 26 August 2015 (UTC)
@MER-C: while that is the case, is there an impediment to removal of the domain, and putting it onto COIBot's monitor list? That way we can re-add pretty quickly if it becomes a problem  — billinghurst sDrewth 03:14, 26 August 2015 (UTC)
@MER-C and SDrewth: .. while wasting volunteers' time and leading them into frustration at having to remove this stuff? This was blacklisted for a reason; I would say that a criterion must therefore be that the focus of the site changes drastically, not just that the site is cleaned up. This is not spam from the company side of the spectrum, it is spam from the many editors abusing its content.
@Pauledmondson: - is the focus of the site now completely different from the old squidoo.com, and completely different from hubpages.com? --Dirk Beetstra T C (en: U, T) 03:26, 26 August 2015 (UTC)

We have no plans to host content on Squidoo.com at this time. Also, if you're not familiar with the evolution of HubPages, there is now pretty extensive editorial oversight (it's no longer like YouTube) and the overall site is being fact-checked and edited by a team of in-house professional editors. We wouldn't make the request for Squidoo if we felt there was a risk that spammy content could return to it. The last thing we want is a return to the spam list. It's been over a year since we acquired the domain. We waited to submit this request until all the content was either unpublished or moved. I can certainly appreciate that the mods at Wikipedia don't want to deal with spam. We have a considerable amount of our resources dedicated to fighting spam as well, and it's frustrating to deal with spam day in and day out. I can also appreciate the necessity of blacklists, given how challenging it is to keep a large site like Wikipedia, which is constantly under attack, spam-free. I think we all strive to do as well as Wikipedia. Thanks again for the consideration. (I apologize if I didn't follow the correct syntax for replying)

  Removed AGF  — billinghurst sDrewth 05:21, 8 September 2015 (UTC)
@Pauledmondson: Hmm, the domain is already being used to spam; while there are no links left, the spammers don't know, or don't care. I am going to flick up the monitoring and see what the alternatives may be to manage this outside of the blacklist.  — billinghurst sDrewth 15:01, 9 September 2015 (UTC)

lulu.com



This is just a book publishing website. I don't see why I can't link to a book from Lulu. --WandaRichards 14:17, 27 August 2015 (UTC)

  Declined @WandaRichards: It is not globally blacklisted, though I note that it is locally blacklisted at English Wikipedia. You are going to need to take your query to w:en:Mediawiki talk:Spam-blacklist  — billinghurst sDrewth 05:24, 8 September 2015 (UTC)

check-and-secure.com





We are the owners of the website check-and-secure.com and we can't link to it, due to the regex entry "d-secure.com" in Wikimedia's meta blacklist. Please add our domain to a whitelist or rewrite the regex pattern to something less far-reaching. — The preceding unsigned comment was added by 87.155.46.93 (talk)

  Done addition  — billinghurst sDrewth 05:11, 8 September 2015 (UTC)
@Billinghurst: - maybe the addition should be '(?!-an)' instead of \b .. IIRC, there was a reason why there was not a \b there. --Dirk Beetstra T C (en: U, T) 05:54, 8 September 2015 (UTC)
hmm .. maybe not, I see that there were many additions done at the same time without the initial \b .. maybe we should just throw off that whole list. --Dirk Beetstra T C (en: U, T) 06:01, 8 September 2015 (UTC)
It was a low-doc addition, so I went with the plain-vanilla issue resolution (mixed metaphors?), and will follow that process if it recurs. I didn't see the need to be more or less responsive, though I have complete trust in any of your actions.  — billinghurst sDrewth 06:45, 8 September 2015 (UTC)
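The collateral match discussed above follows from the blacklist entries being applied as unanchored regexes. A minimal Python sketch (using the entry quoted in the request, simplified to the bare pattern) shows why "d-secure.com" hits check-and-secure.com, and why prefixing `\b` — the fix applied here — stops it: the boundary demands a non-word character before the "d", which the "n" in "and" does not provide.

```python
import re

# Unanchored entry: matches the pattern anywhere in a URL.
entry = re.compile(r'd-secure\.com')
print(bool(entry.search('http://d-secure.com/')))          # True  - intended target
print(bool(entry.search('http://check-and-secure.com/')))  # True  - collateral damage

# With a leading \b, the "d" must follow a non-word character
# (or the start of the string), so "...an|d-secure..." no longer matches.
fixed = re.compile(r'\bd-secure\.com')
print(bool(fixed.search('http://d-secure.com/')))          # True
print(bool(fixed.search('http://check-and-secure.com/')))  # False
```

The `(?!-an)` alternative mentioned above would instead carve out the specific innocent domain; the `\b` form is the more general fix, at the cost of also un-blocking any other subdomain-style evasion such as "spamd-secure.com" style concatenations.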

uservoice.com



uservoice.com is blacklisted as a URL shortener; however, it is not a URL shortener, but a feedback site for many software companies' products, specifically Microsoft's. I believe it should be removed from the blacklist as there is no point to this. Wolf GuySB (talk) 15:45, 5 September 2015 (UTC)

  Removed, though note that if it is abused, or not used appropriately as a credible source for the sites, then it may reappear.  — billinghurst sDrewth 05:26, 8 September 2015 (UTC)

infosecinstitute.com



I've found myself wanting to use this a couple of times but it's filtered, I can't see any reason for this. Deku-shrub (talk) 21:28, 5 September 2015 (UTC)

  Declined, not here, please check local blacklisting on your wiki. --AldNonymousBicara? 22:00, 5 September 2015 (UTC)
Thanks Deku-shrub (talk) 09:03, 6 September 2015 (UTC)

panicattackaway.com



This site contains helpful information for people who need help with stress, panic attacks and anxiety, and it does not have anything spammy on it. — The preceding unsigned comment was added by 196.205.207.86 (talk)

  Declined, not here, but on en.wikipedia per this. Dirk Beetstra T C (en: U, T) 16:47, 28 September 2015 (UTC)

letstalkpayments.com



The website does in-depth research on blockchain technology and, as a fintech lover, I wanted to add some insights from this website into the blockchain article. The information about non-financial use cases of blockchain is unique to this website. However, whenever I try to cite this link, it is blacklisted. Can it be removed? — The preceding unsigned comment was added by 2601:646:1:b3f1:f15f:246f:d986:f35e (talk)

  Declined, not here, but on en.wikipedia. Dirk Beetstra T C (en: U, T) 16:48, 28 September 2015 (UTC)


Troubleshooting and problems

  This section is for archiving Troubleshooting and problems.

Discussion

  This section is for archiving Discussions.

url shorteners

Hi!
IMHO the url shorteners should be grouped in one section, because they are a special group of urls that need a special treatment. A url shortener should not be removed from sbl unless the domain is dead, even if it has not been used for spamming, right? -- seth (talk) 22:11, 28 September 2015 (UTC)

It would be beneficial to have them in a section. Problem is, most of them are added by script, and are hence just put at the bottom. --Dirk Beetstra T C (en: U, T) 04:51, 4 October 2015 (UTC)
Maybe it would seem more preferable to have "spam blacklist" be a compilation file, made of files one of which would be "spam blacklist.shorteners"  — billinghurst sDrewth 12:15, 24 December 2015 (UTC)
This seems like a nice idea. It would certainly help with cleaning it up (which we don't do nowadays). IIRC, it is technically possible to have different spam blacklist pages; it just needs an agreement among us and someone to do it. --Glaisher (talk) 12:17, 24 December 2015 (UTC)

@Beetstra, Lustiger seth, Glaisher, Vituzzu, MarcoAurelio, Hoo man, and Legoktm: and others. What are your thoughts on a concatenation of files as described above. If we have a level of agreement, then we can work out the means to an outcome.  — billinghurst sDrewth 12:39, 25 January 2016 (UTC)

  • I am somewhat in favour of this - split the list into a couple of sublists - one for url-shorteners, one for 'general terms' (mainly at the top of the list currently), and the regular list. It would however need an adaptation of the blacklist script (I've done something similar for en.wikipedia (a choice of blacklisting or revertlisting for each link), I could give that hack a try here, time permitting). AFAIK the extension in the software is capable of handling this. Also, it would be beneficial for the cleanout work, that the blacklist itself is 'sectioned' into years. Although being 8 years old is by no means a reason to expect that the spammers are not here anymore (I have two cases on en.wikipedia that are older than that), we do tend to be more lenient with the old stuff. (on the other hand .. why bother .. the benefits are mostly on our side so we don't accidentally remove stuff that should be solved by other means). --Dirk Beetstra T C (en: U, T) 13:05, 25 January 2016 (UTC)
Is it really possible to have different spam blacklist pages? What would happen to the sites that use this very list to block unwanted spam? —MarcoAurelio 14:23, 25 January 2016 (UTC)
It is technically possible. But this would mean that if we move all the URL shortener entries to a new page, all sites using it currently would have to update the extension or explicitly add the new blacklist to their config, or these links would be allowed on their sites (and notifying all these wikis about this breaking change is next to impossible). Another issue I see is that a new blacklist file means there would be a separate network request on cache miss, so there might be a little delay in page saves (but I'm not sure whether the delay would be noticeable). --Glaisher (talk) 15:38, 25 January 2016 (UTC)
Hi!
Before we activate such a feature, we should update some scripts that don't know anything about sbl subpages yet.
Apart from that, I don't think that sectioning into years would be of much use. One can use the (manual) log for this. A subject-oriented sectioning could be of more use, but this would also be more difficult for us. -- seth (talk) 20:49, 27 January 2016 (UTC)
Another list for shorteners would be a good idea. Also, a bunch of years ago Hoo wrote a script to find (and remove) expired domains. --Vituzzu (talk) 14:10, 22 February 2017 (UTC)

Before developers spend a lot of time on this, I would really prefer that they spend their time to completely overhaul the whole blacklist system, and make it more edit-filter-like (though not with the heavy overhead of the interpretation mechanism; just plain regexes like the blacklist, but with different rules for different regexes). --Dirk Beetstra T C (en: U, T) 10:08, 23 February 2017 (UTC)

This section was archived on a request by: —MarcoAurelio (talk) 11:13, 26 February 2018 (UTC)