Tech/Archives/2012


Liverpool Collegiate School page - Len Horridge photographs fail to display on iPad

The three photographs below by Len Horridge which accompany the Wikipedia page Liverpool Collegiate School are no longer showing up on an iPad since being moved to Wikimedia Commons. They originally displayed fine, and still display fine on iPhones and desktop machines; it's just the iPad where these three photos fail to show. I tried the page on two iPads, so it doesn't seem to be a browser cache issue. The photos DO appear on the individual file pages, but the small images on the Liverpool Collegiate School article page are failing. Can anyone tell me why?

Collegiate_after_cleaning_1973.jpg

Collegiate_hall_as_walled_garden.jpg

Collegiate_hall_firedamaged.jpg Keithbates51 09:11, 22 January 2012 (UTC)

Thanks for reporting this. The problem occurred because those images were deleted from the English Wikipedia, as they were available on our sister project, Wikimedia Commons. Doing a hard refresh of the page to clear the server's cache (also known as a purge), by navigating to this link, fixed the problem. To read more about purging, please see en:WP:PURGE. Hope this helps, The Helpful One 14:35, 22 January 2012 (UTC)
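For reference, a purge link is nothing special: it's just the page URL with action=purge appended. A minimal sketch of building one (the helper name is mine, not a MediaWiki API):

```javascript
// Hypothetical helper: build an action=purge URL for a wiki page,
// the same kind of link used above to clear the server-side cache.
function purgeUrl(host, title) {
    // encodeURIComponent keeps titles with spaces or slashes safe in the query string
    return 'https://' + host + '/w/index.php?title=' +
        encodeURIComponent(title) + '&action=purge';
}

purgeUrl('en.wikipedia.org', 'Liverpool Collegiate School');
// → https://en.wikipedia.org/w/index.php?title=Liverpool%20Collegiate%20School&action=purge
```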

Many thanks! Keithbates51 21:01, 22 January 2012 (UTC)

Question: NOINDEX

Does __NOINDEX__ work? -- Proofreader77 (talk) 20:14, 19 February 2012 (UTC)

Yes, it works on any page that's not defined as a content namespace. On Meta-Wiki, that means it'll work everywhere except in the main (unprefixed) namespace. Why do you ask? --MZMcBride (talk) 01:17, 20 February 2012 (UTC)
  • I vaguely remember a discussion about it not actually doing anything (at some point in the past in some context).
  • I see it doesn't cause a change to the rendered html page.
  • And it doesn't seem to add it to a robots.txt file.
I recently suggested adding it to a contentious RfC, and someone did that ... but just want to make sure that I hadn't suggested something that actually does nothing, as per above. :-) -- Proofreader77 (talk) 01:34, 20 February 2012 (UTC)
It does change the rendered HTML page. That's how it works. Look at the <meta> tags before and after. There will be an additional one if you use "NOINDEX" (norobots or something). Similarly there will be an extra meta tag if you use "INDEX". --MZMcBride (talk) 01:39, 20 February 2012 (UTC)
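To make the mechanism concrete: on a page where __NOINDEX__ takes effect, MediaWiki emits a robots meta tag in the page head. A quick, unofficial way to check a fetched HTML string for it (this checker is my own sketch, not MediaWiki code):

```javascript
// Detect a robots meta tag carrying "noindex" in an HTML string.
function hasNoindexMeta(html) {
    // Matches e.g. <meta name="robots" content="noindex,nofollow"/>
    return /<meta\s+name="robots"\s+content="[^"]*noindex[^"]*"/i.test(html);
}

hasNoindexMeta('<head><meta name="robots" content="noindex,nofollow"/></head>'); // → true
hasNoindexMeta('<head><title>x</title></head>');                                 // → false
```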
I am somehow overlooking it in the source of this page. (My HTML knowledge is a bit stale. Nothing jumps out, but it may be in some included file ... Studying.) -- 01:55, 20 February 2012 (UTC)
That page is in the main (unprefixed) namespace. That namespace is defined as a content namespace, so "NOINDEX" won't work there. This is a security feature to prevent vandalism (e.g., someone removing w:Barack Obama from search engine indices). --MZMcBride (talk) 01:57, 20 February 2012 (UTC)
Thanks. (So yes it won't work there.) Note: When I suggested hiding from search, I was not suggesting __NOINDEX__ but rather something else that could be done; e.g., adding line to robots.txt. The person adding that code apparently thought that's what I meant. I wasn't sure that wouldn't work, so I didn't say anything ... but I (finally) realized I'd better check specifically, since it was done (apparently) due to my suggestion to hide from search. Thanks again. Proofreader77 (talk) 02:35, 20 February 2012 (UTC)
MediaWiki:Robots.txt controls each individual project's robots.txt file. The global version of robots.txt is printed above the local rules in each project's robots.txt file, e.g. <https://meta.wikimedia.org/robots.txt>. It doesn't really matter which de-indexing tool you use; once the content has gotten into search engines, it's very difficult to get it out. robots.txt and "NOINDEX" work best when applied as the page is created. Otherwise, they're of questionable usefulness. --MZMcBride (talk) 02:39, 20 February 2012 (UTC)
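For illustration, a MediaWiki:Robots.txt entry that hides a single page would look something like this (the page path below is hypothetical):

```
# Keep all crawlers away from one specific page
User-agent: *
Disallow: /wiki/Requests_for_comment/Example
```

As noted above, this only helps if it is in place before search engines pick the page up.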
I solved that problem once upon a time in a specific case. (There's a trick to how you have to specify it to Google so it is accepted. yada yada yada) There may not be a solution for this case, of course. Again, thanks. Proofreader77 (talk) 02:50, 20 February 2012 (UTC)

Listen template on meta (403 error)

Is there a special reason the "Listen" template on Meta doesn't produce a cute play button like on en.wikipedia? (Can the Meta one be "upgraded", etc.?) And why does the online player link give a 403 error about some expired user account? (I'd email them, but don't know why I need to. :-) -- Proofreader77 (talk) 05:40, 27 February 2012 (UTC)

It was old and relied on a tools.de hack that has now been retired due to a closed account. I have copied over a version from English Wikisource. billinghurst sDrewth 09:42, 27 February 2012 (UTC)
Change of plan: too much other baggage would need to come over, and this is generally not the place to store and display that media. I will see if I can just update the hack. billinghurst sDrewth 09:49, 27 February 2012 (UTC)
The whole thing seems problematic across namespaces. Just placing an ogg file directly on the page will get it to work fine, and if you want it formatted, stick it in a table.
A Christmas Carol stave 5.
Not sure what is wrong with the template; it seems a tad temperamental, though not really worth throwing a lot of effort at it. billinghurst sDrewth 10:06, 27 February 2012 (UTC)
  • Thank you, billinghurst! Very much appreciate your time and careful attention to this. Certainly agree this is not a priority item, but I will perhaps back-burner ponder further the "problematic across namespace" issue. -- Proofreader77 (talk) 20:31, 27 February 2012 (UTC)

XML file not backwards compatible?

Since recently it is no longer possible to import XML files from the German-language Wikipedia into older MediaWiki versions (1.16.5). What is the reason for this, and is it possible to change something in Import.php to fix it? The error message is:

Warning: xml_parse() [function.xml-parse]: Unable to call handler in_() in /home/marjorie-wikiorg/public_html/literatur/w/includes/Import.php on line 437

In case I'm in the wrong place here, sorry in advance. Best regards, Lady Whistler (Project Other Wikis) 11:36, 6 March 2012 (UTC)

Ombox seems to be broken. Perhaps the new version of MediaWiki did this? Killiondude (talk) 04:54, 11 March 2012 (UTC)

I guess you didn't notice this thread. Too busy voting? --MZMcBride (talk) 05:53, 11 March 2012 (UTC)
I am aware of that unanswered thread. I thought this page might be a better place to bring it up. :) Killiondude (talk) 06:56, 11 March 2012 (UTC)

Can you time-link into .ogv files

Can you time-link into .ogv files (like in youtube you can append e.g., #t=1m20s to link to 1 minute and 20 seconds into a video)? -- Proofreader77 (talk) 20:10, 11 March 2012 (UTC)

Yes and no. bugzilla:26663 is related to this. There appears to be a working script that does this [1], but it was super slow for me on Firefox and didn't work on Chrome. Test it for yourself here. You could attempt to get community support for it to be included in the MediaWiki:Common.js file of whatever wiki you want to use it on. Or you could put it in your own .js file, but then the time link would only work for you and whoever else has it installed. I do have a sneaking suspicion that MZ will correct me if I'm wrong. :) Killiondude (talk) 04:35, 12 March 2012 (UTC)
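For what it's worth, the YouTube-style fragment itself is easy to handle; the hard part is the player integration discussed in the bug. A sketch of parsing "#t=1m20s" into a second offset that a seek script could use (the function name is mine):

```javascript
// Parse a time fragment like "#t=1m20s" or "#t=1h28m30s" into seconds.
function parseTimeFragment(fragment) {
    var m = /t=(?:(\d+)h)?(?:(\d+)m)?(?:(\d+)s)?/.exec(fragment);
    if (!m) {
        return null; // no t= parameter present
    }
    return (parseInt(m[1] || 0, 10) * 3600) +
           (parseInt(m[2] || 0, 10) * 60) +
            parseInt(m[3] || 0, 10);
}

parseTimeFragment('#t=1m20s');    // → 80
parseTimeFragment('#t=1h28m30s'); // → 5310
```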
Mdale did put a gadget at enWS ("Add mwEmbed support for improved Video Playback"), though I cannot say that I have played with it; have a chat with s:en:User:Theornamentalist, as they were the person playing with it. If you have a file at Commons, then it can be hosted with a transcription at enWS. billinghurst sDrewth 04:56, 12 March 2012 (UTC)
  • Many (many!) thanks. Beautifully precise answer(s) — even if "the answer" is not as simple as one might wish (as many answers in the realm of multimedia/compatibility tend to be).

    Note that the specific video file that inspired my question is File:Monthly_Metrics_Meeting_March_1,_2012.ogv (1h 28m 30s), and was hoping there was an easy way to link to specific parts (if for no other purpose than to refresh my own memory on a key point or two.) And yes, I just crashed Chrome :-) ... will continue playing (and chatting in direction aimed). -- Proofreader77 (talk) 05:33, 12 March 2012 (UTC)

Query Limit from mobile or web Client

Hi, I am working on R&D for a new way to browse CC-licensed content on Wikipedia.
Suppose there are 10,000 users with their mobile phones browsing the wiki at the same time via the API, through the client we are developing (a mobile app).
I want to know how to design the app so as to guarantee that people can browse without delays or "dead queries".


We have a client (mobile/web) which contains the names of the objects to be populated with Wikipedia content.

For example, to see the content of the movie "Blade Runner", I'd like to use an exact query to display, say, the plot paragraph via the Wikipedia API.
Is there a limit on the number of queries which can be made via the API?
The same concept would apply to the whole wiki; in that case I would simply like to fetch the whole article and restyle it.
Again, is there a limit for this?
Finally, I'd like to share the profit with the Wikimedia Foundation. Who should we contact? Thank you! The preceding unsigned comment was added by Gg4u (talk • contribs).

Hi.
The English Wikipedia's API is available at <https://en.wikipedia.org/w/api.php>. All Wikimedia wikis have an API that can be easily utilized at their respective domains.
Regarding API limits, you're free to hit the API at a reasonable rate. Unfortunately I don't have an exact figure I can give you for requests/second or requests/minute. Just be reasonable.
Due to the architecture of Wikimedia's community (far more readers than editors), the HTML cache layer (in Squid) is much stronger than the API. If it's possible to easily use the rendered and stored HTML, I would. (It also comes with the benefit of being parsed for you.) But using the API is also fine if that's what your application needs.
The most important piece of writing a tool or script of this nature is to use a descriptive User-Agent, one that includes contact information for you in case you need to be reached regarding issues with your tool or script. Read the linked page for more information.
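As a sketch of the advice above (the endpoint is real; the app name, version and contact address are placeholders to replace with your own):

```javascript
// Assemble a polite API request: explicit endpoint, URL-encoded parameters,
// and a descriptive User-Agent carrying contact information.
var apiUrl = 'https://en.wikipedia.org/w/api.php';
var params = { action: 'parse', page: 'Blade Runner', format: 'json' };

// Build the query string from the parameter map.
var query = Object.keys(params).map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
}).join('&');

// Placeholder identity -- substitute your app name and a reachable address.
var headers = { 'User-Agent': 'ExampleMovieBrowser/0.1 (contact@example.com)' };

apiUrl + '?' + query;
// → https://en.wikipedia.org/w/api.php?action=parse&page=Blade%20Runner&format=json
```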
For donations, contact donate wikimedia.org. That should get you routed to the appropriate Wikimedia staffer, though I'll ask someone else to double-check this.
Hope that helps. --MZMcBride (talk) 03:09, 15 March 2012 (UTC)
Yep, that's the right address. Thanks, MZ. Philippe (WMF) (talk) 03:19, 15 March 2012 (UTC)

Quick follow-up: there's been some mailing list discussion about this topic. Original thread; click through the "next message" links to read replies, particularly this one and this one. mw:API:Etiquette will also likely be useful to you. --MZMcBride (talk) 22:33, 15 March 2012 (UTC)

Global things

Hi! I didn't know where to ask, so I chose to ask here. Watching the Recent changes page on other wikis, I saw that reverters (I think they are from the "Small Wiki Monitoring Team") used en:Wikipedia:Twinkle (but normally that isn't available on those wikis). So is Twinkle (and maybe some other gadgets, for example HotCat) available for global use? --Edgars2007 (talk) 16:43, 26 March 2012 (UTC)

Hi. Global gadgets do not currently exist (bugzilla:20153).
What most people do to have a script in multiple places is keep a central repository (at Meta-Wiki or Commons or their home wiki or wherever) and then import that script into their /common.js or /skinname.js subpage on the projects where they work. For example, my central script repo is at User:MZMcBride/global.js and I import that script into other wikis such as commons:User:MZMcBride/monobook.js and wmf:User:MZMcBride/monobook.js. The benefit to using /common.js (as opposed to /skinname.js) is that /common.js will work with any skin, should you choose to change later. (Personally, I think I'll always use Monobook.) Hope that helps. --MZMcBride (talk) 03:06, 27 March 2012 (UTC)
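The pattern described above boils down to one mw.loader.load line per wiki, pointing at the raw text of the central script. A sketch (the user page below is illustrative, and the helper function is mine):

```javascript
// Build the action=raw URL used to import a central user script.
function rawScriptUrl(host, title) {
    return '//' + host + '/w/index.php?title=' + encodeURIComponent(title) +
        '&action=raw&ctype=text/javascript';
}

// In each wiki's /common.js you would then write something like:
// mw.loader.load(rawScriptUrl('meta.wikimedia.org', 'User:Example/global.js'));

rawScriptUrl('meta.wikimedia.org', 'User:Example/global.js');
// → //meta.wikimedia.org/w/index.php?title=User%3AExample%2Fglobal.js&action=raw&ctype=text/javascript
```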

Adding a section to the "Special characters" portion of WikiEditor toolbar

Hi. Using the WikiEditor toolbar, I'm trying to add a (test) section to the "Special characters" portion of the editing toolbar. The code I'm trying is here: testwiki:User:MZMcBride/extrachars.js. I tried following the documentation, but to no avail. Anyone know? --MZMcBride (talk) 22:43, 30 May 2012 (UTC)

Your code works fine in Google Chrome's console, if executed after the characters menu is already visible (but nothing happens if it is executed while the section is hidden).
See also this topic: mw:Extension_talk:WikiEditor/Toolbar_customization#Adding_characters. Helder 02:48, 31 May 2012 (UTC)
You see a section called "TEST" in the list among "Latin", "Arabic", etc.? I don't. The goal is to have working code in the browser, not some console. :-) --MZMcBride (talk) 02:56, 31 May 2012 (UTC)
Yep! I see it (only if the list of special characters is open when the code is executed).
You could try this alternative "customizeToolbar" (based on Danmichaelo's code):
var customizeToolbar = function() {
        $('#wpTextbox1').bind('wikiEditor-toolbar-buildSection-characters', function (event, section) {
                section.pages.TEST = {
                        'layout': 'characters',
                        'label': 'Emoticons',
                        'characters': [':)', ':))', ':(', '<3', ';)']
                };
        });
};
It seems to work even if the section is collapsed. Helder 03:44, 31 May 2012 (UTC)
Cool, thanks. :-)
I'm still a bit curious why my code doesn't work as expected and why the documentation in this area is so poor, but those are separate matters. --MZMcBride (talk) 17:52, 31 May 2012 (UTC)


Restrictions on Upload page

A few Indic-language Wikipedians have asked for a feature (for their respective language wikis) on the Special:Upload page to make sure that users select a license and type a description before uploading images. Do we have a feature in MediaWiki to support this? I have seen a similar feature on Commons, where some of these actions are mandatory. Is this a gadget/extension/something else?--Shiju Alex (WMF) (talk) 11:27, 5 June 2012 (UTC)

Perhaps you mean the UploadWizard? Regards, Tbayer (WMF) (talk) 23:50, 12 June 2012 (UTC)

Help regarding an analysis tool!

I am looking for a particular kind of analysis for a set of usernames over a date range (e.g. Jan 2012 to May 2012). The input will be a set of usernames (say 40 usernames), and the tool should run the following queries and give output in .CSV format. I have documented my queries here. Can anyone help me, please? --Subhashish Panigrahi (talk) 11:00, 6 June 2012 (UTC)

I looked at <https://wiki.toolserver.org/view/~Psubhashish/tool>. I don't see which (MySQL?) queries you wish to run. If you can give the queries and which replicated databases you'd like the queries run against (e.g., "enwiki_p"), I can run them. --MZMcBride (talk) 04:07, 8 June 2012 (UTC)
Thanks a lot! The query would be < a join of the wiki activities of a set of users within a specific timeline > --Subhashish Panigrahi (talk) 07:58, 12 June 2012 (UTC)

Checkboxes at Special:UserRights — Order changed for no reason?

Tracked in Phabricator:
Bug 37584

Hello. I don't know if it's a software issue or not, so I'm posting here instead of Bugzilla. If you look at the checkboxes at Special:UserRights, "CheckUser", which was usually in the middle of the list, now appears at the very bottom of the page. It's not a disastrous issue, but I'm just wondering why this is happening. Please compare with File:Userrights.png. Regards. —Marco Aurelio (audiencia) 15:50, 8 June 2012 (UTC)

Hmm, strange. The order I see currently is:
  • bot
  • administrator
  • bureaucrat
  • steward
  • account creator
  • importer
  • transwiki importer
  • oversight
  • confirmed user
  • CheckUser
This is an arbitrary order, as far as I can tell. Definitely a bug. Please file a bug in Bugzilla when you have a chance. :-) The list should be alphabetical or something else that's sensible. --MZMcBride (talk) 18:14, 8 June 2012 (UTC)
Thanks. Probably just "CheckUser" needs to be moved up as it was in the past. I'll pass it to bugzilla. Regards. —Marco Aurelio (audiencia) 20:32, 9 June 2012 (UTC)
Done at bugzilla:37584 with CC to you. — MA (audiencia) 09:38, 14 June 2012 (UTC)

HTML5 in Wikipedia

Hey, guys. We need a Working Draft for an HTML5 version of Wikipedia. (Get it?) As some users may know, HTML5 is a new version of the HyperText Markup Language designed with the DOM, AJAX, and competition with Adobe Flash in mind. Each time the development team has attempted to introduce HTML5 to the MediaWiki software, it broke several scripts and bots, so they dropped those changes. For more information about the HTML5 revolution, click here. To test whether your browser is compatible with HTML5, click here.

Here are some main points from the main discussion at WP Village Dump:

  • Wikipedia pages should be divided into semantic elements that a regular viewer wouldn't care about, but that would make processing by AI simpler. Examples of such tags include article (for the body of an article), nav (for navigation elements and category links), and section (for sections of Wikipedia articles).
  • Inline MathML could be used rather than PNG to display equations and formulae in articles, in order to save bandwidth and allow easier processing. For example, MathML equations could be copied and pasted into documents, in whole and in part, much more easily than PNG equations.
  • Editors should be encouraged to use appropriate tags for emphasis, instead of just bold or italic. This is a W3C recommendation and, while not necessary, could be used to improve the machine-readability of featured articles. (Not everyone agreed to this, but it's worth discussing.)

There are also a few things I haven't discussed yet, such as Microdata and WYSIWYG editing. Any other ideas? Seriously, don't hesitate to propose or discuss them! 68.173.113.106 ("The Doctahedron") 20:56, 9 June 2012 (UTC)

(Note: the above IP editor is actually an experienced, regularly-editing editor on enwiki with a static IP.) HTML5 was tried before but broke a lot of things; it's currently not the highest priority. On WYSIWYG, we should have one by the end of summer. MathML was tried, but the developers were too lazy to get a working interpreter.--Jasper Deng (talk) 21:07, 9 June 2012 (UTC)
That's what we're trying to discuss. Of course, we could test stuff out at TestWiki without danger of it breaking. Possible domain: html5.test.wikipedia.org 68.173.113.106 18:38, 10 June 2012 (UTC)
Rather, html5.wmflabs.org (Labs).--Jasper Deng (talk) 18:39, 10 June 2012 (UTC)
For the record, the request to enable HTML5 is marked as one of the "1.18 post-deployment actions", and since we are already using MW 1.20 on this wiki, I think it is about time to schedule it to be enabled. Also, I would say some things are already broken, and turning on HTML5 would just make that more obvious. Helder 22:34, 10 June 2012 (UTC)

Native version of MediaWiki?

I was wondering. Would rewriting MediaWiki in a C-based programming language, integrating the database and Web server directly into the application, make it run any faster? (This is especially a problem on Wikia sites where I keep getting 324 Errors.) 68.173.113.106 22:00, 6 July 2012 (UTC)

In theory, yes, but there'd be a lot of time and effort needed to re-write MediaWiki and all relevant extensions in C. Take a look at w:HipHop_for_PHP for a "short cut". Also, aren't Wikia in the midst of a migration? So things breaking isn't unexpected. Reedy (talk) 22:09, 6 July 2012 (UTC)
Migration? My main idea was that the concept of a database change vector (used in Oracle software, probably integrated in MySQL) would be applied to the actual wiki pages so a database would be unnecessary except to map page names to oldid's. Theoretically, you could just look up pages by oldid. 68.173.113.106 23:28, 7 July 2012 (UTC)
There's a false dilemma here. The options are not exclusively rewrite all of MediaWiki (probably well over half a million lines of code) or have a slow shitty wiki on Wikia. If Wikia's hosting is giving you 324 (empty response) errors, that sounds like a problem with Wikia's hosting. It could also be a few other pieces (are you using Chrome, by chance?). In any case, MediaWiki is fast enough if you have a cache layer and don't do some of the stupid shit that Wikia does. If you value your wiki and your data, the right answer is probably just getting better hosting and using MediaWiki as-is rather than attempting to rewrite it. --MZMcBride (talk) 23:37, 7 July 2012 (UTC)
Migration/upgrade. See MediaWiki Upgrade. Happy reading. Reedy (talk) 23:39, 7 July 2012 (UTC)
I've learned a few things about database management and it turns out that usually the problem isn't the software, it's that the database is slow. And yeah, I use Chrome. 68.173.113.106 15:16, 15 July 2012 (UTC)

Computer modern font for SVG

Tracked in Phabricator:
Bug 38299 resolved as fixed

Is it possible to install the Computer Modern font (the one used by LaTeX renderings) and make it available in SVG images? This would add consistency to articles that use equations and illustrations explaining them (articles about geometrical proofs).--Gauravjuvekar (talk) 06:33, 8 July 2012 (UTC)

Sure. You'll just need to figure out which font you want (it should be freely licensed) and file a ticket in Bugzilla to have it installed on the SVG rendering servers. The current SVG fonts are listed at SVG fonts, I think. --MZMcBride (talk) 03:02, 11 July 2012 (UTC)

Added--Gauravjuvekar (talk) 07:59, 11 July 2012 (UTC)

tables: header options are ignored

Source:

{|class="wikitable" style="width:15em"
!align="right" bgcolor="yellow"| text
|-
|align="right" bgcolor="yellow"| text
|-
!style="text-align:right;background:yellow"| text
|}

Result: three rows each reading "text"; in the rendered output only the plain cell and the CSS-styled header cell pick up the right-alignment and yellow background.

When wiki options like align= or bgcolor= are specified in header cells (!), these options are ignored, although they work in plain cells (|). CSS styles, however, still work in header cells. AVB (talk) 10:27, 13 April 2012 (UTC)

tables: wrong processing width=

MW incorrectly treats width="nnem" as width="nnpx". Example:

Code and result (each cell contains "T"):

width="30px" : rendered 30 pixels wide
width="30em" : also rendered 30 pixels wide (the bug)
style="width:30em" : rendered 30em wide, as intended

AVB (talk) 10:34, 13 April 2012 (UTC)

In the HTML spec, "width" accepts a plain number. If you want to specify a unit, you should use CSS. Werdna (talk) 21:20, 23 July 2012 (UTC)
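In other words, an em-based table width has to go through CSS; the bare width attribute only understands plain numbers, taken as pixels. For example:

```
{| class="wikitable"
| width="30em" | T
| style="width:30em" | T
|}
```

The first cell comes out 30 pixels wide; the second is the intended 30em.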

inconsistent align= behavior for class="wikitable"

The align="right" attribute is accepted on the table's header line ({| ...), whereas align="center" is ignored:

class="wikitable" align="right"
class="wikitable" align="center"

When the class is not specified (or differs from "wikitable"), align="center" works fine:

align="center"

AVB (talk) 13:30, 19 April 2012 (UTC)

This is not a bug, but simple cause and effect from using deprecated methods like the align= attribute. align=center is interpreted by browsers as adding automatic CSS margins left and right, which works fine (as we can see in the second sample without "wikitable"). The "wikitable" class (like any CSS class) is just a group of CSS rules applied to an element of choice. Among other rules, "wikitable" contains (and should contain, by design) a rule for margin. And as the class has precedence over obsolete attributes like "align", it overrides that. This is not a bug, nor MediaWiki related. This is how (most) browsers are designed to work, but it can indeed be confusing when they meet in the same element like this.
If you really need to override the margin from the class attribute, use an inline style attribute that sets the margin accordingly (e.g. style="margin-left: auto; margin-right: auto;") instead of the align="center".
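Putting that together, a wikitable centered via margins rather than align="center" looks like:

```
{| class="wikitable" style="margin-left: auto; margin-right: auto;"
| centered content
|}
```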
Krinkletalk 15:28, 15 July 2012 (UTC)
  • that sets the margin accordingly - unfortunately, this is counterintuitive and makes it easier to drop class="wikitable" and continue using other classes (like class="standard") or <center> tags around the table. AVB (talk) 08:17, 1 August 2012 (UTC)

Special:WantedPages on Uncyc outdated

When I go to Special:WantedPages on Uncyclopedia, there are still links to articles that already exist, for example "UnPoetia:Main Page". On Wikipedia, they would be shown as blue links with strikeouts, however on Uncyc they are all redlinks. Is this a bug? 68.173.113.106 00:03, 21 July 2012 (UTC)

It's a Wikia bug from their most recent upgrade IIRC. Ajraddatz (Talk) 00:10, 21 July 2012 (UTC)
Even on Wikipedia, why don't the blue links go away after a while? 68.173.113.106 21:12, 23 July 2012 (UTC)

missing feature: compact tables

Currently, the parser ignores cells placed after the row separator (|-); this forces editors to write very sparse and inconvenient markup. Compare:

Current status:

{| class="wikitable"
| 1 || 1
|- |ignored
| 2 || 2
|- ||ignored
| 3 || 3
|-
| 4 || 4
|-
| 5 || 5
|-
| 6 || 6
|-
| 7 || 7
|-
| 8 || 8
|-
| 9 || 9
|-
| 10 || 10
|}

Wanted:

{| class="wikitable"
|- || 1 || 1
|- || 2 || 2
|- || 3 || 3
|- || 4 || 4
|- || 5 || 5
|- || 6 || 6
|- || 7 || 7
|- || 8 || 8
|- || 9 || 9
|- || 10 || 10
|}

Result: a ten-row, two-column table ("1 1" through "10 10") either way.

AVB (talk) 08:28, 1 August 2012 (UTC)

missing feature: "else" for empty template-variables

The current syntax for template variables allows a missing argument to be replaced by a default value ({{{var|default value}}}), but not an empty argument. This forces the use of heavyweight constructions with #if. A typical example:

{{Infobox
| name = Abracadabra
| logo text =
| developer = Someone
}}

In the template, you then need a construction like the following to process the variable (in this example, PAGENAME is used as the default for the parameter):

{{#if: {{{logo text|}}} | {{{logo text|}}} | {{PAGENAME}} }}

Or even more complex code (in this example, parameter variations like "logo text", "Logo text", "logo_text" and "Logo_text" are supported):

{{#if: {{{logo text|}}} | {{{logo text|}}} | {{#if: {{{Logo text|}}} | {{{Logo text|}}} | {{#if: {{{logo_text|}}} | {{{logo_text|}}} | {{#if: {{{Logo_text|}}} | {{{Logo_text|}}} }} }} }} }}

A possible solution: extend the default syntax, something like:

{{{logo text||{{PAGENAME}} }}}
{{{logo text||Logo text||logo_text||Logo_text|}}}

Here "||" means "use the given value when the parameter is missing OR empty". Probably syntax like {{{var|def1|def2}}} could also be supported, but that is another story. AVB (talk) 08:46, 1 August 2012 (UTC)

This will likely not be a problem any more with mw:Lua scripting. guillom 15:47, 1 August 2012 (UTC)
  • 1. That is another tool, not related to current template scripting, and pointing to it does not solve the reported problem. 2. It is not available yet. AVB (talk) 20:34, 3 August 2012 (UTC)
You can normalize template parameters by using the built-in {{lc:}} function. This won't solve the underscore normalization issue, though. --MZMcBride (talk) 16:24, 1 August 2012 (UTC)
  • Of course. And note that "parameter normalization" is only one application; there are also other cases where a check for emptiness is needed. To be precise, I think the current semantics of {{{var|def}}} are too often useless. AVB (talk) 20:34, 3 August 2012 (UTC)
You can use {{{logo text|{{{Logo text|{{{logo_text|{{{Logo_text| {{PAGENAME}} }}}}}}}}}}}} Ruslik (talk) 17:57, 1 August 2012 (UTC)
  • This is incorrect code: it does not give PAGENAME for an empty parameter. Also, it will not check the other parameters if "logo text" is empty. AVB (talk) 20:34, 3 August 2012 (UTC)

Vector skin

Does Wikimedia distribute the Vector skin at all? 68.173.113.106 18:35, 1 August 2012 (UTC)

The Vector skin is part of MediaWiki and is distributed with all versions of MediaWiki since 1.16 IIRC. Some of the related user interface enhancements are in the Vector and WikiEditor extensions, which are distributed separately. --Catrope (talk) 18:46, 1 August 2012 (UTC)

User-Agent

Please see Talk:User-Agent_policy#Posting. Regards. — MA (audiencia) 09:34, 4 August 2012 (UTC)

Talk pages like that should probably be redirected here.... --MZMcBride (talk) 22:03, 4 August 2012 (UTC)

$wgSimpleFlaggedRevsUI

Hi. I'd like to know what this variable does. We're currently discussing at eswb how to configure FlaggedRevs. I've tried to find the answer myself, with no success. Thanks. — MA (audiencia) 10:05, 4 August 2012 (UTC)

From mw:Extension:FlaggedRevs#User interface:
$wgSimpleFlaggedRevsUI – When enabled, a simpler, icon based UI is used. Does not affect the tags shown in edit mode.
So it sounds like instead of using text in the user interface, FlaggedRevs uses icons. --MZMcBride (talk) 20:35, 4 August 2012 (UTC)
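For reference, on a self-hosted wiki this is a one-line LocalSettings.php setting (on Wikimedia wikis only a site configuration change can set it):

```php
// Use the simpler, icon-based FlaggedRevs interface
$wgSimpleFlaggedRevsUI = true;
```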

Two broken javascripts

Hi. These are on a private wiki so I've copied the scripts below. The first one stopped working when we switched from monobook to vector and I can't get it to work at all.

The second one stopped working at the same time, but still works with the monobook skin (or when &useskin=monobook is added to the URL).

Thanks for any help. Rjd0060 (talk) 21:09, 10 September 2012 (UTC)

Does the first script work if you disable "Enable enhanced editing toolbar" and "Enable dialogs for inserting links, tables and more" in preferences? Ruslik (talk) 09:15, 11 September 2012 (UTC)
Yes, it seems to work just fine that way. Interesting. Can we get it to somehow work with the toolbar as well? If not, this is good enough, in my opinion. Rjd0060 (talk) 23:15, 11 September 2012 (UTC)
Ok, I created User:Ruslik0/otrstoolbar.js that adds the same two buttons to the new toolbar. Ruslik (talk) 08:51, 12 September 2012 (UTC)
That one works fine. Any ideas on the other? Rjd0060 (talk) 23:00, 12 September 2012 (UTC)

Works for me; have you made sure the link to "templates" doesn't appear? (You have to hover over the arrow at the top right, the one holding the links to edit, delete, etc.) - Hoo man (talk) 23:22, 12 September 2012 (UTC)

I confirm that it works without any problems. Ruslik (talk) 08:05, 13 September 2012 (UTC)

Backup of deleted revisions check

Hi. At Meta:Deletion policy it says that Meta does not back-up deleted revisions. I know that in the past it was true but I'd like to know if nowadays that is still true. Regards, -- MarcoAurelio (talk) 08:33, 14 September 2012 (UTC)

There's a dump of the deleted pages and revision history listed here QU TalkQu 09:56, 14 September 2012 (UTC)
Hmm, deleted revisions are excluded from the backup process on dumps.wikimedia.org as the deleted revisions may contain sensitive data which we don't want to publish. --Hydriz (talk) 10:27, 14 September 2012 (UTC)
It seems database dumps would not help resolve this question, then. -- MarcoAurelio (talk) 23:07, 14 September 2012 (UTC)

Any JavaScript gurus want to improve the project portals?

If anyone is bored: <https://meta.wikimedia.org/w/index.php?title=Talk:Project_portals&oldid=4158191#Add_search_suggestions_to_wiki-project_portals>. It should be fairly easy to do. --MZMcBride (talk) 23:51, 24 September 2012 (UTC)

Sidebar link script

Hi. Not sure when this stopped working, but it did. It should cause the sidebar link (n-responses) to link to a specific section of the page (#En:...). Currently it just goes to the top of the page as usual.


addOnloadHook(function() {
    document.getElementById("n-responses").getElementsByTagName("a")[0].href += "#en:_English_English";
});

Thanks! Rjd0060 (talk) 00:13, 4 October 2012 (UTC)

What project are you talking about? Ruslik (talk) 11:04, 4 October 2012 (UTC)
Hi. This relates to otrswiki: again. Rjd0060 (talk) 14:52, 5 October 2012 (UTC)
Look at the rendered page in your Web browser. Find the part of the page where you want this link to lead to. Then view the page's HTML source in your Web browser. Do an in-browser search (ctrl-f or cmd-f) for the relevant part of the page where you want this link to lead. For example, if it's a header called == English ==, find the relevant text string "English" in the HTML page source. Please paste this relevant portion of HTML below and it should be trivial to fix your issue. --MZMcBride (talk) 00:56, 6 October 2012 (UTC)
<h3><span class="mw-headline" id="en:_English_English">en: <a href="/wiki/Response:En" title="Response:En">English English</a></span></h3>

Thanks Rjd0060 (talk) 01:47, 6 October 2012 (UTC)

Is there an element with the id "n-responses"? Ruslik (talk) 07:09, 6 October 2012 (UTC)

Try this:

$(document).ready( function() {
    var target = $('#n-responses a').attr('href');
    $('#n-responses a').attr('href', target + '#en:_English_English');
});

I imagine there are more elegant ways to do this, but this should work. --MZMcBride (talk) 01:27, 8 October 2012 (UTC)
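One slightly more elegant variant is to factor the fragment-appending step into a small helper, so the gadget stays idempotent if it happens to run twice. This is only a sketch: `withFragment` is a hypothetical name, not part of any MediaWiki or jQuery API.

```javascript
// Hypothetical helper: set a link's fragment, dropping any existing
// one first, so re-running the gadget doesn't pile up "#..." suffixes.
function withFragment(href, fragment) {
    var base = href.split('#')[0]; // strip any existing fragment
    return base + '#' + fragment;
}

// Applied on DOM ready, the snippet above would become something like:
// $(function () {
//     var $link = $('#n-responses a');
//     $link.attr('href', withFragment($link.attr('href'), 'en:_English_English'));
// });
```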

Works good. Thanks! Rjd0060 (talk) 01:51, 8 October 2012 (UTC)

Server response times

Not sure whether this is the right place for this, but here goes: recently I filled out the Research:Wikipedia Editor Survey 2012, and I didn't remember to mention the "waiting for bits.wikimedia.org…" and "waiting for upload.wikimedia.org…" annoyances. It seems that the servers have a difficult time serving everyone at the same time, which of course I can empathise with, but what are those "bits" and "upload" thingies doing when I'm not uploading anything, or just reloading a page after a quick category update, when most of the content is in my browser cache already? I'm frustrated, and I should have mentioned in the survey that it discourages me from contributing. Where is the right place to complain about it and be heard? Thanks in advance. --Jerome Charles Potts (talk) 22:01, 5 November 2012 (UTC)

Any images, videos or other files viewed come from upload.wikimedia.org. Static content (JavaScript, CSS and some images) is served from bits.wikimedia.org. Reedy (talk) 22:04, 5 November 2012 (UTC)
Could you provide examples (that is, exact steps somebody else could follow to reproduce the problem) of when exactly these "waiting for" messages are shown when they shouldn't be there? --Malyacko (talk) 13:06, 6 November 2012 (UTC)
It seems that the problem was on my end: I had many open browser windows, each with many tabs. I finally closed as many as I could let go, and relaunched Google Chrome (on Windows XP). I was originally sorting media on Commons using HotCat, and each change was painfully slow. Eventually I noticed quite a bit of "thrashing", with "waiting for cache…" notices. Things are much better today. I still see the "waiting for bits…" message, but hardly have time to notice the "waiting for upload…" one, since my Commons updates happen so fast now. --Jerome Charles Potts (talk) 19:53, 16 November 2012 (UTC)

Commons database report for duplicate files

Is it possible to get a Commons database report for duplicate files, similar to the one at Wikipedia? It would ideally run daily. Reedy suggested the following SQL query on IRC:

[15:26] <Reedy> select count(*) as cnt, img_sha1 from image group by img_sha1 having cnt > 1;

Thanks in advance! Logan Talk Contributions 20:28, 17 November 2012 (UTC)

Artikel-Feedback-Tool-Cats in InitialiseSettings.php

Hello,
we have a question over here on the German Wikipedia: we plan to introduce the Article Feedback Tool, and a question arises: should we add ~260 categories directly to InitialiseSettings.php (see "wmgArticleFeedbackv5Categories"), or add approx. 11,000 articles to one big category to activate the AFT on those articles? I think the first option will not only clutter the file but also greatly increase the size of [2] (all the categories will be in it), which is included in every page load. The second will only generate a lot of edits. Am I missing something? How should we do it? --Se4598 (talk) 21:03, 19 November 2012 (UTC)

Be aware that the translation might not be the best possible. I suggest you proofread it (see also mw:Help:Extension:Translate/Quality assurance) before you enable it, for a smoother transition (only a few messages have been checked so far). This page also lists some particularly problematic messages you might want to triple-check; others are those containing "oversight" (see bugzilla:39282/bugzilla:35026); and most importantly, bugzilla:39578/bugzilla:38349 has not been fixed yet, among others. --Nemo 21:32, 19 November 2012 (UTC)

New namespace here on Meta

Crossposting from Wikimedia Forum#New namespace on Meta. Steven Walling (WMF) • talk 23:17, 29 November 2012 (UTC)

So wiki CSS

Hi. A user from the so wiki has been leaving requests on my en wiki user talk page asking for help adding CSS to the so wiki. He is trying to add templates like navboxes and infoboxes to the wiki, but doesn't understand that the required CSS is missing. He needs w:en:MediaWiki:Common.css copied over to w:so:MediaWiki:Common.css. I am not sure how he found me, but I don't have the ability to edit these files on the so wiki. --Odie5533 (talk) 10:46, 15 December 2012 (UTC)

  Done, see your enwiki talk. - Hoo man (talk) 15:18, 15 December 2012 (UTC)