Talk:HTTPS
Organisation of the content page
Here are the thoughts that drove my writing/expansion of the content page.
I have followed wikitech-l and wikitech-ambassadors for many years (less for the latter :) and I have difficulty tracking the various topics. I'm interested in the HTTPS topic and I think both technical and community people should be involved in this question, hence a job for a wikitech ambassador.
I think this page should be a portal/hub -- even better than it is presently -- explaining the general topic, the various questions involved (better than only the keywords presently), the past and coming deployments, and collecting pointers to the state of the discussion (mainly wikitech-l and Bugzilla for this topic).
Any thoughts about this organisation?
(Perhaps we can use this section -- with subsections -- to discuss the organisation specifically, and leave the other sections for actual HTTPS questions.)
~ Seb35 [^_^] 09:34, 18 August 2013 (UTC)
- Be bold. :-) --MZMcBride (talk) 16:40, 18 August 2013 (UTC)
- Hi Seb35. Thanks for the dump. I am a little concerned, though, that people who are pointed to this page due to the planned Wednesday deploy of SecureLogin by default won't know where to begin; it's a little text-heavy without much "what is going on, what this means for you" in simple and brief language. I hope you don't mind, but I'm going to Be bold and drastically rearrange this page to address the most common questions and concerns from users who are having problems with HTTPS. I'm going to keep much of what you wrote and mostly just re-arrange it. Thanks again! Greg (WMF) (talk) 22:49, 19 August 2013 (UTC)
- Hi, thanks a lot for the rewriting; it's better now with these simple explanations. I removed my paragraph about PRISM since you summarised it better in the introduction; I hesitated about adding a link, but it could distract the reader. ~ Seb35 [^_^] 12:14, 20 August 2013 (UTC)
Soft HSTS (HTTP Strict-Transport-Security)
I wrote an extension for MediaWiki to enable HSTS on a per-user basis. The rationale is to let privacy-conscious users opt in to HTTPS every time (for all compatible browsers), independently of whether they are currently logged in or not, and at the same time not force other users to use HTTPS (particularly if they live in an HTTPS-unfriendly environment such as China).
I think such an extension would be worth installing on the Wikimedia projects (at first on the test wikis and then on the MediaWiki wiki) to balance the advantages and drawbacks of HTTPS, as well as to slowly increase the HTTPS load on the servers. Since HSTS comes with a strong constraint (a fatal error in case of a bad TLS/HTTPS connection), it could be tested with increasing expiration times (1 hour; 1 day; 2 weeks; 1 month) with some volunteers during the next year to check that everything works correctly (e.g. users will certainly complain if their old browser's cipher suite or SSL 3.0 support is removed).
~ Seb35 [^_^] 17:03, 18 August 2013 (UTC)
- I misunderstood the deployment planned for August 21, 2013; I thought it was only the log-in page. This extension becomes almost useless if all HTTP traffic is 301-redirected to HTTPS, although it could be tweaked to drop the user preference and be activated for all logged-in users, in order to remove the 301-redirect-over-HTTP step, which can still be MITMed/spied on. ~ Seb35 [^_^] 18:18, 18 August 2013 (UTC)
- And the extension is now updated to give the server administrator the possibility to force anonymous and/or logged-in users to receive an STS header, or to let logged-in users choose whether they want one. ~ Seb35 [^_^] 20:06, 18 August 2013 (UTC)
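(For reference, the mechanism involved is the standard HTTP Strict-Transport-Security response header; the test schedule sketched above would correspond roughly to these max-age values, taking a month as 30 days:)
- 1 hour: Strict-Transport-Security: max-age=3600
- 1 day: Strict-Transport-Security: max-age=86400
- 2 weeks: Strict-Transport-Security: max-age=1209600
- 1 month: Strict-Transport-Security: max-age=2592000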
Load script from home wiki
En.wiki is my home wiki, and in all other projects I have this in my vector.js:
mw.loader.load( '//en.wikipedia.org/w/index.php?title=User:Edokter/MenuTabsToggle.js&action=raw&ctype=text/javascript' );
This fails in modern browsers as an XSS violation when on HTTPS. Are there any provisions whereby the various Wikimedia domains are trusted with regard to each other? — Edokter (talk) — 09:56, 20 August 2013 (UTC)
- I don't know XSS well, but this syntax works for me (Opera 12.16; I have it in my en.wp common.js). Also, I see on mw:ResourceLoader/Migration guide (users)#Migrating user scripts that it is recommended to keep importScript instead of switching to mw.loader to avoid problems; perhaps you can try with importScript. Otherwise, what browser and version do you have? ~ Seb35 [^_^] 12:41, 20 August 2013 (UTC)
- I see en:User:Edokter/MenuTabsToggle.js itself calls mw.loader.load with 'http' explicitly specified, and Firefox 23.0 here complains about the mixed content due to that. Try making that protocol-relative as well? Anomie (talk) 13:47, 20 August 2013 (UTC)
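(For illustration — the URL here is abbreviated, not the actual line inside MenuTabsToggle.js — the change Anomie suggests is from an explicit http: prefix to a protocol-relative one:)
mw.loader.load( 'http://en.wikipedia.org/w/index.php?title=...&action=raw&ctype=text/javascript' ); // loads over plain HTTP even on an HTTPS page: mixed content
mw.loader.load( '//en.wikipedia.org/w/index.php?title=...&action=raw&ctype=text/javascript' );      // inherits the protocol of the current page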
- A bug in Chrome (or was it IE?) prevents the use of protocol-relative stylesheets in the .load method. XSS stands for 'cross-site scripting'. Most modern browsers block loading scripts from another domain when you are on a secure connection. The only way to allow secure cross-site scripts to load is by whitelisting those domains in the certificate. That is what my question pertains to: does our certificate have such a whitelist? — Edokter (talk) — 20:47, 20 August 2013 (UTC)
- Independently of XSS, regarding HTTP/HTTPS in scripts and problems with protocol-relative URLs, you can use something like
mw.loader.load( window.location.protocol+'//en.wikipedia.org/w/...' );
- (ref) ~ Seb35 [^_^] 22:03, 20 August 2013 (UTC)
- Seb35's solution is what I most commonly see for addressing this issue, so that's probably a good solution in this case. Edokter, the enwiki certificate has most of the other domains in the Subject Alternative Name. I personally haven't encountered the issue with the browsers I use, but I'll see if I can reproduce if you let me know what you're using. CSteipp (talk) 22:42, 26 August 2013 (UTC)
- HTTPS has just kicked in and to my surprise, all my scripts seem to load from en.wiki just fine. — Edokter (talk) — 22:09, 28 August 2013 (UTC)
Excluded language
Please add ckb Wikipedia to the excluded languages. All sysops of ckb.wiki are from Iran and we cannot access Wikimedia projects via HTTPS.--Calak (talk) 19:01, 20 August 2013 (UTC)
- What is the exact list of the excluded languages: is it only zh and fa? Or are there also the other Chinese languages: yue/zh-yue, wuu, gan, cdo, nan/zh-min-nan? Plus those of Iran? Is there a Gerrit patch for the config? ~ Seb35 [^_^] 20:25, 20 August 2013 (UTC)
- Please see bug 52846; these wikis are mostly used in Iran and have the same problem: fa, ckb, mzn, glk, ku. I don't know anything about the Chinese languages.--Calak (talk) 20:34, 20 August 2013 (UTC)
- The current plan is to exclude based on the IP/location of the user, not the language wiki. This means that users from e.g. China will be able to log in to English Wikipedia over HTTP if needed. Greg (WMF) (talk) 23:40, 21 August 2013 (UTC)
update of automated editing tools
Are pywikipediabot, AutoWikiBrowser, etc., now HTTPS compliant? It would be extremely disruptive to implement this change without automated editing tools being able to accommodate it. DavidLeighEllis (talk) 19:24, 20 August 2013 (UTC)
- According to this message (just sent), pywikipediabot was updated for HTTPS, and pywikipediabot users must update their code to the latest version. ~ Seb35 [^_^] 21:27, 20 August 2013 (UTC)
- Please let me know (preferably via IRC or email, greg wikimedia.org) if you see any tools that are not ready as soon as possible. Thanks! Greg (WMF) (talk) 23:42, 21 August 2013 (UTC)
Commons, Wikidata and SUL
Won't this create lots of trouble for people in China and Iran who need to upload an image to Commons or modify an interwiki link on Wikidata?
- Commons: You must be logged in to upload an image. If HTTPS is unavailable and mandatory, then you can't log in.
- Wikidata: It is possible to edit links without logging in, so I suppose that you can edit as an IP using HTTP? This however reveals your IP for everyone to see. Editing over HTTP also reveals your IP address, but only to anyone who is wiretapping you at the same time as you are accessing the Internet.
- SUL: Logs you in automatically if you visit another project. Doesn't this mean that Chinese and Iranian users will have to log out in order to read an article in another language? This could be very inconvenient if you at the same time need to be logged in on Chinese or Iranian Wikipedia.
You can presumably access HTTPS by using a VPN or Tor, and you can disable login on selected projects by changing your cookie settings. However, lots of users probably don't know how to do this, and the result will instead be that logging in will be disabled. I am personally able to use HTTPS and have HTTPS Everywhere installed, but I understand that a lack of unencrypted HTTP for logged-in users may cause lots of problems for users in some countries. --Stefan2 (talk) 19:49, 20 August 2013 (UTC)
- The current plan is to redirect based on the IP/location of the user, not based on language wiki. This should alleviate this issue as we'll whitelist users from China and Iran initially, along with potentially some locations that have a hard time using HTTPS. Greg (WMF) (talk) 23:44, 21 August 2013 (UTC)
HTTPS problem
Hello. Now we can't access WM projects (except Meta, Commons and Incubator)!--Calak (talk) 19:54, 20 August 2013 (UTC)
- I guess Meta, Commons, and Incubator should all be excluded from the forced HTTPS until the "GeoIP" functions are available, in order not to cause any problems there for Iranian or Chinese users who e.g. want to complain here on Meta about HTTPS problems, are active in uploading files on Commons, or are active in test projects in their language on Incubator. --MF-W 20:01, 20 August 2013 (UTC)
Now Bugzilla, MediaWiki and Wikidata don't have any problem, but all WM projects are inaccessible. I am a sysop on ckb Wikipedia and I can't open even one page! What are we supposed to do?--Calak (talk) 20:07, 20 August 2013 (UTC)
- Please see the "Help!" section of the subject-space page. IRC is fastest. :-) --MZMcBride (talk) 21:23, 20 August 2013 (UTC)
- I'm confused, Calak: can you honestly not access "even one page" *right now*? We haven't made any changes yet. Greg (WMF) (talk) 23:45, 21 August 2013 (UTC)
why Chinese languages are excluded?
I think the Chinese Wikipedian community has made it clear that all Chinese languages (Classical Chinese, Wu Chinese, etc.) should not be converted to the HTTPS version, but here I see nothing. Can somebody tell me why, please? As a Classical Chinese editor, I will no longer be able to do my work if it is true that Meta has forgotten its promises.
Benjamin Jiperus (talk) 23:04, 20 August 2013 (UTC)
- See also bugzilla:52846#c12 and try to get a complete list of the languages. --Stefan2 (talk) 23:12, 20 August 2013 (UTC)
The Chinese community has given a list showing which Chinese languages should not be on HTTPS; it also includes Tibetan, another Sino-Tibetan language that will be affected, because most of the native speakers are located in China, where HTTPS isn't an easy option. We already have restrictions from the Communist Party. HTTPS is no help to us. ----Benjamin Jiperus (talk) 23:16, 20 August 2013 (UTC)
Leave choice to users
Please let users choose which transport they want to use. --Purodha Blissenbach (talk) 23:06, 20 August 2013 (UTC)
- This. It will be complete crap if I am forced to edit through HTTPS; it's slow on my unstable connection. And FYI, forcing users should only be a last resort, and I see no reason to change to HTTPS voluntarily. KPu3uC B Poccuu (talk) 02:32, 21 August 2013 (UTC)
Please leave the choice to the user; it is so wrong to force people to use HTTPS if they want to stay connected. --PierreSelim (talk) 05:19, 21 August 2013 (UTC)
- You can turn off HTTPS in the preferences. As of now I don't see the option; I guess it will be deployed in a few hours, at the same time HTTPS becomes mandatory. ~ Seb35 [^_^] 06:33, 21 August 2013 (UTC)
- Keep in mind that that will only disable the secure browsing and secure editing features; log-in remains HTTPS-only even if that preference is selected. Risker (talk) 08:12, 21 August 2013 (UTC)
- That should be enough for my JavaScript bot and for web-debugging. I'd prefer if I don't have to make my browser trust the Fiddler certificate. -- Rillke (talk) 08:11, 22 August 2013 (UTC)
Is there any way to make opting out of HTTPS a global preference? I tend to go to the foreign-language wikis to get more extensive articles about subjects of greater import to that language community, and I'm having to manually go to the preferences page the first time I visit each of them after the change to opt out of HTTPS for that wiki. That has been and remains my biggest gripe about the HTTPS switchover. Carolina wren (talk) 02:31, 12 September 2013 (UTC)
Translation
Hi. Why is/was neither this page nor the global message delivery post translatable? Do you think everybody on the planet is at least en-2? Although I try to AGF, I can't see it as anything other than disrespect towards local non-English communities, who are not obliged to learn English. The WMF should never do it that way again. --Base (talk) 23:26, 20 August 2013 (UTC)
- Basically this is now possible in a manual way, although I'm personally reluctant to use the Translate extension because the page is not stable and would require frequent re-translations; but if someone wants to set it up, just do it. I asked the Meta admins to change the link in the banners (I would be very happy if someone could do that!). I sent a message to the translators-l mailing list about this translation. ~ Seb35 [^_^] 13:02, 21 August 2013 (UTC)
List of Chinese Wikipedia
Please note that these languages also need to remain on HTTP.
- Classical Chinese (lzh)
- Wu Chinese (wuu)
- Gan Chinese (gan)
- Hakka Chinese (hak)
- Cantonese (zh-yue)
- Tibetan (bo)
- Min Nan Chinese (zh-min-nan)
Benjamin Jiperus (talk) 23:28, 20 August 2013 (UTC)
Sorry, there are two more languages:
Uyghur is a Turkic language; a majority of Uyghur people live in the western part of China.
Thank you
— The preceding unsigned comment was added by Benjamin Jiperus (talk) 23:49, 20 August 2013 (UTC)
- I opened bug 53144 to track this problem and included the list you gave (+zh :) ~ Seb35 [^_^] 07:39, 21 August 2013 (UTC)
Perfect forward secrecy
Will the new HTTPS-enabled setup have the perfect forward secrecy property? If not, it would be nice to have. There should be an option to prioritize Diffie-Hellman key exchange to achieve this. Wishing you good results with the procedure; good to see this coming. --Methossant (talk) 23:35, 20 August 2013 (UTC)
- According to "The future of HTTPS on Wikimedia projects", PFS will not be enabled soon, but it is in the future plans. From the things I read about PFS, this blog post is interesting (activation of PFS by Google) and this one says PFS only costs about 15% more in compute time (the ECDHE version; classical DHE is much more expensive -- about 300% more in compute time). But PFS is not completely supported by all modern browsers: IE support is lacking and Opera only has DHE; Chromium and Firefox have full support. ~ Seb35 [^_^] 07:05, 21 August 2013 (UTC)
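(Not Wikimedia's actual configuration -- just a generic sketch of what "prioritising ECDHE" means on a TLS server, here using Node.js; the certificate file names are placeholders:)
const tls = require('tls');
const fs = require('fs');

// ECDHE suites are listed first and the server's order is honoured, so each
// connection gets an ephemeral key and therefore forward secrecy.
const server = tls.createServer({
  key: fs.readFileSync('server-key.pem'),   // placeholder paths
  cert: fs.readFileSync('server-cert.pem'),
  ciphers: 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA:AES128-SHA:!aNULL:!MD5',
  honorCipherOrder: true
}, (socket) => socket.end('ok\n'));
server.listen(8443);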
- If you have any news in this regard: do you want to upgrade to something like quantum cryptography? João bonomo (talk) 20:06, 11 September 2013 (UTC)
Unable to edit Wikidata
It says to report trouble here.
I have not been able to edit Wikidata while logged in for the past couple of days.
The [edit] buttons all appear initially, then they vanish.
Varlaam (talk) 00:36, 21 August 2013 (UTC)
- Hi Varlaam, I doubt this is due to HTTPS (the topic of this Talk: page) as nothing has changed yet :-). I've reported it to the Wikidata team in this bug report. Greg (WMF) (talk) 05:07, 22 August 2013 (UTC)
- Greg, thanks for pointing this our way. Varlaam, can you give a bit more info? Browser used? Operating system? Can you try switching off your gadgets and see if there is some interplay? --denny (talk) 09:30, 22 August 2013 (UTC)
- I agree with Greg that it shouldn't be HTTPS. Try disabling all your gadgets, then re-enable each one individually to figure out whether one of them is the problem. I remember some time ago a script caused a similar issue for me. If this doesn't help (if you still have the issue with all gadgets off, even after clearing your browser's cache), the extra information Denny asked for might be of some help. Hazard-SJ ✈ 23:25, 22 August 2013 (UTC)
Copy editing notes
- Please add a link to some formal statement or roadmap or approved plan that says "[t]he continued work to improve the security and privacy of Wikimedia project users is a stated goal of the Wikimedia Foundation". (I'm not doubting it's true, but a footnote/reference helps.)
- It might also be useful to link to some internal or external site that confirms the use of Wikipedia as a site useful for tracking.
- I recall some discussion on how images would be handled with HTTPS enabled, but wasn't technically competent enough to follow the conversation. Are there any issues with images that will change the effectiveness of the HTTPS login and browsing? If so, and since we have a huge number of non-Commons pages with images on them, it would be appropriate to mention this.
- Also appropriate to mention any other situations where HTTPS may not be as effective as expected.
Risker (talk) 05:29, 21 August 2013 (UTC)
- 1. no idea about that
- 2. added a Jimmy Wales tweet with an image (perhaps the newspaper article is better, I don't know)
- 3 and 4. I guess you are speaking about traffic analysis, which allows an attacker to guess the page you are reading by collecting the sizes of the pages, images, and resources from all pages and comparing them afterwards to some specific request you made; if so, I think it is too technical and specific to be worth mentioning here (but it is because of this that I wrote in the introduction "you can have a reasonable expectation that nobody knows what pages you are visiting").
- ~ Seb35 [^_^] 13:22, 21 August 2013 (UTC)
A request of resolution
Here is all I know (forgive my English):
- Chinese Wikipedians sent a list of Chinese languages (that are not suitable for HTTPS) while the meeting was still going on two weeks ago. I wasn't at the meeting; people who were there told me the Wikimedia Foundation would take the Chinese languages into account (for example Wu, Cantonese, Classical, etc.).
- Now people are telling me that except for (standard) Chinese, the other dialects and Classical Chinese will still use HTTPS.
- This is not what we expected.
- I hope what is promised (if there is a promise made by the Wikimedia Foundation) will be carried out.
- If I am wrong about what I say, please tell me the actual situation of the conversion to HTTPS.
thank you----Benjamin Jiperus (talk) 09:34, 21 August 2013 (UTC)
- Hi Benjamin. Sorry for the confusion. The current status, as described on HTTPS, is that users from countries where HTTPS is not an option (specifically, China and Iran) will not be redirected to HTTPS but instead will stay on HTTP. Thus, if you are a Chinese user accessing English Wikipedia, you will still be able to log in and view everything because you will be going over HTTP (as your computer is based in China). Hope that clarifies; let me know if it doesn't. Greg (WMF) (talk) 20:18, 27 August 2013 (UTC)
Which documents?
The preamble of this article mentions: "The Wikipedia project was referenced in several documents as a specific source for tracking users' behavior online". Which documents are referenced here? Was it discussed somewhere? -- Ace111 (talk) 13:31, 21 August 2013 (UTC)
- This was raised by Risker two sections above; I just added a link to one document I know, but I don’t know others. ~ Seb35 [^_^] 13:37, 21 August 2013 (UTC)
- Jimbo's Twitter post refers to an NSA slide from The Guardian with a question: "Why are we interested in HTTP?" and an answer "Because nearly everything a typical user does on the Internet uses HTTP", followed by examples of web sites such as Facebook, Yahoo, Twitter, myspace, CNN, mail.ru, Wikipedia, Google and Gmail. I don't think this is a proper reference for Wikipedia as a specific source for tracking users' behavior. -- Ace111 (talk) 14:55, 21 August 2013 (UTC)
- I remember some people in the discussions (on wikitech-l probably) having a similar opinion. I don't have a real opinion myself on whether this constitutes evidence or not; it's the only thing I know of that people referred to for this claim, but perhaps there are others I missed. ~ Seb35 [^_^] 22:19, 21 August 2013 (UTC)
privacy concerns with images embedded in pages, served by Commons
Is HTTPS fully supported as well for images hosted on Commons (if Commons does not have HTTPS support for now), when these images (most often thumbnailed) are embedded in Wikipedia pages? Can you guarantee that no one will know which Wikipedia page you are consulting (including talk pages) if the pages embed these images in a unique way? It is easy to deduce which pages you are reading if you can track the images embedded in them. Commons should therefore support HTTPS, including on the image servers (but there's a huge scalability problem because Commons is extremely large and the thumbnail image servers have lots of work to do, probably much more than the wikis themselves).
Also, a good thing to do would be to bring ALL WMF wikis together on the same base domain, so that we could use the same login domain and remain in it, even when navigating to foreign wikis (it would only require a new design for paths, e.g. allowing transparent remapping of
- http(s)://(projectlangcode).(wikipedia).org/wiki/(pagename)
to
- http(s)://www.wikimedia.org/(wikipedia)/(lang)/wiki/(pagename)
One big advantage would be the unification of security and the simplification of navigation. In fact we would just connect via HTTP or HTTPS to a single proxy with a single URL,
and then all navigation would occur within the same domain within the same session, to request any number of pages or images from any wiki of the foundation. Only within this encrypted HTTPS session would the legacy URLs be requested. This proxy domain would connect directly to any one of the existing Squid proxies, forwarding the requests internally to the appropriate project wiki and image servers (or other services, including WordPress blogs, IRC channels, submission APIs, etc. that are compatible with HTTP proxies).
But users would then not see any change in their address bar, which would only show the proxy domain during their navigation across pages, projects and services. Even when they right-click a link, they would be shown anonymized URLs encrypted with their current secure session keys (e.g. https://go.wikimedia.org/(base-64 encoded encrypted string containing the service/project/language/page?get parameters)). Note that the proxy delivering an HTML wiki page could also transform these links using short URLs specific to the session, to keep the base-64 strings short (the proxy would cache the URLs from the original page to build short URLs specific to the session and not comparable across users, using a simple SHA-1 digest of the full URL, encrypted with the private session key, so that the delivered HTML pages would only contain relative links like "/(short base64 string)").
Base-64 encoding is suggested, but a larger encoding alphabet could make the string even a bit shorter, similar to the strings used by services like TinyURL. The non-encrypted strings could also be cached internally with a mapping to make them even shorter (based on usage statistics) for frequently accessed URLs. The encryption of these short URLs would still remain private and specific to each user session.
Note that this encryption of URLs does not require delivering any encryption key to the client; it is purely internal to the proxy, which manages the encryption/decryption keys itself. Also, this encryption will not increase the length of the short URLs.
verdy_p (talk) 19:51, 21 August 2013 (UTC)
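(A rough Node.js sketch of the per-session URL shortening described above, purely illustrative: the digest, encoding and key handling are assumptions, not an actual Wikimedia design.)
const crypto = require('crypto');

// Per-session secret held only by the proxy; it is never sent to the client.
const sessionKey = crypto.randomBytes(32);
const cache = new Map(); // token -> original URL, kept per session

// Turn a legacy URL into a short, session-specific, URL-safe token.
function shorten(fullUrl) {
  const token = crypto.createHmac('sha1', sessionKey)
    .update(fullUrl)
    .digest('base64')
    .replace(/[+/=]/g, ''); // keep the token URL-safe
  cache.set(token, fullUrl);
  return '/' + token; // served as e.g. https://go.wikimedia.org/<token>
}
// The proxy would rewrite links in delivered HTML with shorten() and resolve
// incoming requests by looking the token up in the per-session cache.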
- Commons and upload.wikimedia.org (where images are actually served from) support HTTPS. All of our projects have fully supported HTTPS for quite some time now.--Ryan Lane (WMF) (talk) 01:08, 29 August 2013 (UTC)
Loss of browser history
I just had cause to use Internet Explorer and it seems as though it was not saving history for HTTPS pages. I have not investigated fully, but if I were forced to use Internet Explorer, I would rather be subject to snooping by the NSA than lose my browsing history. RHaworth (talk) 21:07, 21 August 2013 (UTC)
Impossible Log out
Since tonight, I cannot log out. Is this some feature of HTTPS? --ŠJů (talk) 20:50, 22 August 2013 (UTC)
- I usually use the secure login and have never had that problem, so I wouldn't say it's a feature. Did you try going to Special:UserLogout, at least on the secure server? 72.252.242.170 23:20, 22 August 2013 (UTC)
- The page Special:UserLogout appeared, and the login at the top of the page disappeared but reappeared after 1 or 2 seconds. However, the bug seems to be gone now. --ŠJů (talk) 00:53, 23 August 2013 (UTC)
http/https link embedded in wiki text using the link tool …
… does not work. If you select the link tool and paste a URL, the protocol should be stripped off to get protocol-relative URLs automatically. At least for WP links like this one (completely arbitrary): https://de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Koordinatenfreak&diff=118605869&oldid=118178984 the link tool should strip the protocol. Adding //de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Koordinatenfreak&diff=118605869&oldid=118178984 to the link tool will create //de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Koordinatenfreak&diff=118605869&oldid=118178984 (a URL with the prefix http:////de.wikipedia.org). Furthermore, the wikitext (protocol-relative URL) //de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Koordinatenfreak&diff=118605869&oldid=118178984 should autolink like the non-protocol-relative URL does. --Herzi Pinki (talk) 23:02, 22 August 2013 (UTC)
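(A one-line illustration of the stripping being asked for -- a sketch only, not the link tool's actual code:)
var url = 'https://de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Koordinatenfreak&diff=118605869&oldid=118178984';
var protocolRelative = url.replace( /^https?:/, '' ); // gives '//de.wikipedia.org/w/index.php?title=...'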
API
Is this going to affect browser logins only, or will pure API sessions (those that occur outside a browser environment) be affected as well? This, that and the other (talk) 09:17, 25 August 2013 (UTC)
- This is only for browser logins; the API does not force a redirect. CSteipp (talk) 22:47, 26 August 2013 (UTC)
Causing problems
I have disabled this in my preferences, as the default left me totally unable to edit, or even view, Wikipedia. (Every time I clicked on a link, or even typed in a URL, I was switched to an HTTPS URL, with a popup window saying "couldn't parse page name from url ...".) Previously, if directed to an HTTPS page, I have had to remove the "s" in order to open the page, but the new change made this impossible. Eventually I managed to open my preferences, find the relevant entry, and uncheck the tick; now everything works smoothly again.
I have no idea what is causing the problem; whether it is my Wikipedia gadgets, my Firefox add-ons, or my security settings. Nor do I feel inclined to experiment and see what tweaks, if any, would resolve the problem. But it is clear to me that this change should not become a default, or be made mandatory, since this would certainly lock out readers and editors.
If anyone responds to this, please inform me at my talk page on English Wikipedia; I have never before visited Wikimedia. RolandR (talk) 00:13, 29 August 2013 (UTC)
- I'm pretty sure this is due to a script you have installed (as I indicated on the VPT thread). I fixed it. ^demon (talk)
- That doesn't seem to have resolved the problem. I re-enabled the secure login in my preferences, and again started to get the "couldn't parse page name" pop-up. Which of my scripts might be causing this? RolandR (talk) 10:56, 29 August 2013 (UTC)
- In fact, this seems to have made the problem worse; now, although I have unticked the checkbox in Preferences, every link defaults to https, and I am unable to open any page easily. Please help! 86.186.105.211 11:10, 29 August 2013 (UTC)
- If you wish to change the HTTPS setting back to HTTP, try going to [1] or [2].
- If you wish to fix the problem, try going to w:User:RolandR/monobook.js and w:User:RolandR/vector.js and change "http://" to "//" everywhere on the pages (see the example below). The JS files can only be edited by you or by an administrator at English Wikipedia.
- If this doesn't help, then it probably means that something is wrong in one of the scripts that you are importing. --Stefan2 (talk) 12:31, 29 August 2013 (UTC)
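(For example -- with a hypothetical script name -- a line like the first one below would become the second:)
mw.loader.load( 'http://en.wikipedia.org/w/index.php?title=User:Example/script.js&action=raw&ctype=text/javascript' );
mw.loader.load( '//en.wikipedia.org/w/index.php?title=User:Example/script.js&action=raw&ctype=text/javascript' );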
Hello,
I also have some problems. First, I didn't get the images until I clicked through to a Commons page and accepted the security certificate; that is now solved. But it seems the scripts don't work. I have no personalised JS or CSS (the /monobook.js page and the like don't exist).
Any clue?
Google Translate
When I paste "https://fr.wikipedia.org/wiki/Sp%C3%A9cial:Gadgets" into either http://translate.google.com/ or https://translate.google.com/ I get the message:
- Google Error
- Sorry, this URL is invalid
- http://fr.wikipedia.org:443/wiki/Sp%C3%A9cial:Gadgets
But when I paste the regular "http://fr.wikipedia.org/wiki/Sp%C3%A9cial:Gadgets", it translates it fine. Can someone attest that this is a Wikimedia problem and, if so, submit a bug about it? - dcljr (talk) 08:37, 29 August 2013 (UTC)
- Looks like a bug in the google tool to me... Apparently it has existed for years already. Perhaps WMF operations can tickle their google contacts a bit. TheDJ (talk) 11:52, 29 August 2013 (UTC)
- I've had this problem ever since I started using HTTPS (a while ago). It's a problem with Google Translate. @Snowolf: didn't you write to a mailing list about this last year? PiRSquared17 (talk) 18:55, 31 August 2013 (UTC)
- I did. I don't think they ever replied to me. Snowolf How can I help? 22:09, 31 August 2013 (UTC)
- Hmm. Maybe now that Wikimedia is going HTTPS for all logged-in users on all projects, they'll get more pressure to do something about it. - dcljr (talk) 01:26, 2 September 2013 (UTC)
I work for Google, and got the following answer from Google Translate team:
"Currently, the Google proxy-based web translation cannot translate HTTPS pages. It's a known limitation, and we'd like to remove that limitation."
- OK, well, there's a very easy fix for that… - dcljr (talk) 02:23, 17 September 2013 (UTC)
I added this section to help users, particularly with script problems or certificate problems. Be bold in rewriting it or adding issues; I have only my own experience of HTTPS problems and those reported on the village pumps (although it is always difficult to understand precisely what is wrong and to work out solutions).
For the scripts, I guess this is quite important, and many editors could wonder why their scripts suddenly no longer work; see these few searches: [3] (search for the exact case loadJS), [4], [5], [6]. Some users even call external scripts (and sometimes the destination server doesn't support HTTPS). For these scripts, I don't know exactly which should be preferred: importScriptURI, or mw.loader.load?
About the root certificate (DigiCert High Assurance EV Root CA): are there known issues with devices that don't have it in their certificate store? I usually see it on the Internet and with my browsers (Opera, Firefox), but perhaps there are exotic devices, browsers, operating systems, or combinations of these which don't recognize it. ~ Seb35 [^_^] 19:15, 29 August 2013 (UTC)
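(For anyone wondering about the two loading forms mentioned just above: both accept protocol-relative URLs, so either avoids hard-coding http:. The script name below is hypothetical, and which form is preferable is exactly the open question here.)
importScriptURI( '//en.wikipedia.org/w/index.php?title=User:Example/script.js&action=raw&ctype=text/javascript' );
mw.loader.load( '//en.wikipedia.org/w/index.php?title=User:Example/script.js&action=raw&ctype=text/javascript' );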
Help Needed: Problems due to Secure site change
I'm not even sure how I found my way here to this page, but I am in desperate need of help, as nothing I have tried has made any real difference. Rather than repeat what I've already written over at Wikipedia, I am going to re-post my own initial comment, followed by the entire contents of the discussion that ensued when I created a new section under the same heading I've used here -- which I think may be useful for other editors to read. Cgingold (talk) 07:08, 2 September 2013 (UTC)
I) My browser history is absolutely integral to how I do things on the internet. When my browser history first disappeared, I assumed that the problem was somewhere on MY end of things. I racked my brain, I checked everything -- all to no avail. I had no idea that Wiki had switched over to secure pages, so I was really at my wits' end until I just by chance finally noticed that little "https" in the URLs. <Light bulb goes on> I then spent an hour googling a variety of search strings in hopes of finding something that would help resolve the problem. Again to no avail. Finally, I came here [to the Village Pump - technical] to see if anybody had raised the issue.
Wikipedia discussion at Village Pump (technical) talk page
- ALL EMBEDDED LINKS ARE FOR WIKIPEDIA
As other editors have already made note of, the change-over to Secure URLs is causing problems for a lot of us. I've already explained above how upsetting it was to have my Wikipedia browsing history suddenly disappear. I mistakenly thought that unchecking the setting in my preferences had taken care of the problem. Not So. It turns out that when I specify one of the normal/non-secure Wikipedia URLs, the software no longer recognizes me as a logged-in user -- iow, the pages display as they do for any anon. IP user (and as I discovered by accident, my IP address shows up in the edit history, as one would expect). But as soon as I go to a secure URL, everything goes back to normal -- and that's without having to log in again; iow, it keeps me logged in, as it should. But those secure pages don't show up in my browser history, no matter what I do. Yes, I've changed & re-changed my preference setting and logged back in several times. And I've also looked for the trouble-making cookies that were mentioned above -- but curiously, I don't seem to have ANY cookies for Wikipedia, even though it is one of the few sites where I expressly allow them by default. <sigh> Any help on this would be hugely appreciated! Cgingold (talk) 08:34, 1 September 2013 (UTC)
- You don't actually say - either here or at your previous post - which browser you're using. --Redrose64 (talk) 09:56, 1 September 2013 (UTC)
- Ah, I was wondering why when I open a new browser window it shows me as not logged in. Kudpung กุดผึ้ง (talk) 09:54, 1 September 2013 (UTC)
- Well, if anyone is interested, I am using Ffox 23.0.1 on MacOS 10.8.4. Looks as if the default move to https has thrown more than one spanner in the works. Kudpung กุดผึ้ง (talk) 10:36, 1 September 2013 (UTC)
- I too am using Firefox, although under Windows XP. Here's how I successfully went back to http:
- Go to Preferences, switch off "Always use a secure connection when logged in" and save
- Log out of Wikipedia and close all browser tabs that contain Wikipedia pages - but keep the browser itself open
- In the browser menu bar, go to Tools → Options → Privacy
- Click the link "remove individual cookies", this opens a new window titled "Cookies" - maximise that, it makes things easier
- In the search bar of "Cookies", enter forceHTTPS exactly like that - it's case-sensitive
- Highlight a row and click Remove Cookie (the Delete key works on a PC, don't know about a Mac)
- Repeat until all are gone, then click Close
- Visit http://en.wikipedia.org/wiki/ and log in
- You should now be back on http: --Redrose64 (talk) 10:50, 1 September 2013 (UTC)
- Thanks for your detailed reply, Redrose64. Hopefully that will be very helpful for any Firefox users who find their way to this page. Unfortunately it doesn't help me, perhaps because I am using IE8 (although they are very similar). I had already tried to find those cookies (they were also mentioned above), but as I noted, I was unable to turn up ANY cookies at all for Wikipedia among the cookies listed in my TIF folder. Very puzzling, to say the least. Cgingold (talk) 12:52, 1 September 2013 (UTC)
- Try looking in your Cookies folder. I don't quite understand why, but Internet Explorer's cookies can be found in both "Temporary Internet Files" and "Cookies". – PartTimeGnome (talk | contribs) 20:28, 1 September 2013 (UTC)
- New Wrinkle - For a time, I was able to move from page to page in the non-secure universe, and at least those pages would show up in my browser history. But something seems to have changed. Now I cannot access ANY non-secure pages -- even when I specify "http" in the address bar of my browser, it just redirects to "httpS", regardless of whether I am logged IN or logged OUT. I think I may be starting to lose my mind. Cgingold (talk) 13:18, 1 September 2013 (UTC)
- What edition of Windows do you use? It seems that IE in the Windows Server and possibly Professional editions does not store anything by default if the user is using HTTPS. —TheDJ (talk • contribs) 15:18, 1 September 2013 (UTC)
- I'm using Windows 7. What's crazy about all of this is that unchecking the new default setting in preferences seems to have no effect whatsoever. To repeat, yes I've logged out and back in multiple times, and the setting remains unchecked. I've also looked and searched for those problem cookies, but cannot locate them, if they're even there at all. And I've never seen a Cookies folder, either. Please keep thinking about this -- your efforts are greatly appreciated! Cgingold (talk) 01:36, 2 September 2013 (UTC)
- Update: I can now see my Cookies Folder. It took some doing -- no thanks to Microsoft! -- to find out how to get that folder to display. (It requires un-checking the default setting that hides all "protected operating system files".) Unfortunately, the only Wiki cookies I found were quite innocuous. No sign of those problem cookies that force the browser to use https. So I'm right back where I started: Still no way to get my browser to display non-secure pages -- and still no browser history for Wikipedia. :( Cgingold (talk) 03:12, 2 September 2013 (UTC)
My experience with HTTP links
The instructions here say, "If a user wants to login, the login will happen over HTTPS… and after they are logged in they stay on the HTTPS version of the Wikimedia site they are using." We-e-ell... I suppose this is technically true, but it doesn't quite work the way most people would expect.
After logging out and logging in again a couple of days ago (on the English Wikipedia) under the new HTTPS regime, I noticed that when following HTTP links to most other Wikimedia wikis from e-mails and bookmarks, I was not showing up as being logged in (was arriving under the HTTP protocol, of course). My bookmarks to the English Wikipedia and Meta seemed to be okay (I have visited those since the changeover, so presumably I already did something to fix the problem there), but if I tried my bookmark to, say, the Italian Wikibooks (to choose a random one I hadn't visited in a while), I showed up as being logged out. Reloading didn't help. Reloading bypassing the cache didn't help. Manually typing in the prefix "https://" in the address bar... that did it. Then I was logged in. (So... I have to actually use HTTPS to "remind" WM that I'm already logged in as a registered user?)
Needless to say, I didn't want to manually type in a protocol every time I visited a new Wikimedia wiki (I help to maintain Wikimedia News, so I have occasion to visit "random" WM wikis all the time)! So I did something I really hate doing: I cleared all my cookies. And all my cookie settings (allow/deny/etc.). And my cache. And my browsing history. And then I closed and reopened Firefox (17.0.7).
After logging in to the English Wikipedia (my "home wiki") again, I found that when I first visited another WM site (in this case, Meta) — via a regular wikilink, no less... not even bookmarks or e-mail links yet — I showed up as not logged in! (And I had to accept a new cookie, which I expected, of course.) Then on to Commons: not logged in! Hey! Whatever happened to SUL?? Okay, so then I reloaded the page. Only then did it tell me "You are centrally logged in… Reload the page to apply your user settings." Okaaay. Reloaded a second time. Fine. But before reloading, I noticed that I was actually already logged in as me after the first reload. Something's screwy here... Finally, by going to another "new" wiki, I noticed that the first and second reloads can be replaced by two "local" page loads of any sort (i.e., following links to other pages on the same wiki). In other words, I'm not logged in when I first bring up a wiki (a Main Page, in particular), but after either reloading or following an internal link, I am logged in. But only then do I get the message about "[applying my] user settings". But at that point that's already been done! Something tells me this is not how it's supposed to work.
So then I tried an old HTTP bookmark (this time to the Czech Wikipedia — remember, I first logged in to the English Wikipedia). Not logged in. Not in HTTPS. Reloading didn't work. Back to where I started. So then I navigate through HTTPS links (via List of Wikipedias) back to the Czech Wikipedia. I am logged in under HTTPS. Okay, back to the List and then to the, oh, Ilokano Wikipedia. I am logged in under HTTPS.
Hmm. Okay, back to an HTTP bookmark to the Ilokano Wikipedia. Nope. HTTP and not logged in. (But I was just here! Don't you recognize me?) Finally, to an HTTP bookmark to the English Wikipedia. This time, mercifully, I show up as logged in under HTTPS.
So, the moral of this story seems to be: If you explicitly log in to a Wikimedia wiki, HTTP links to that wiki will come up as HTTPS and have you logged in. Following links to any other Wikimedia wiki (assuming they're HTTPS) works, but only after reloading or following a link. But visiting any other wiki through an HTTP link, even though you have purportedly been "CentralAuth" logged in to "all" of them, you don't get recognized as being logged in and remain an anon under HTTP. Pretty annoying.
So now I have to go through 820 or so bookmarks changing "http:" to "https:" (not to mention actually visiting all 820 wikis again to set new cookies). Yay. - dcljr (talk) 10:45, 2 September 2013 (UTC)
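(If it helps anyone facing the same bulk edit, a rough Node.js sketch for rewriting an exported bookmarks file; the file names and domain list are assumptions, and the original file should be backed up first:)
const fs = require('fs');

// Rewrite http:// to https:// for Wikimedia-family hosts in an exported bookmarks.html.
const wmHosts = /http:\/\/([a-z\d-]+\.)*(wikipedia|wikimedia|wiktionary|wikibooks|wikinews|wikiquote|wikisource|wikiversity|wikivoyage|wikidata|mediawiki)\.org/g;
const html = fs.readFileSync('bookmarks.html', 'utf8');
fs.writeFileSync('bookmarks-https.html',
  html.replace(wmHosts, (match) => match.replace('http://', 'https://')));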
- When I got "You are centrally logged in… Reload the page to apply your user settings.", I checked Special:Preferences and found that the setting for "Always use a secure connection when logged in" was opposite to what I thought it was. Flipping it and saving, then reloading the appropriate page, got things back in synch. It was necessary to do this separately for meta, commons, etc. --Redrose64 (talk; at English Wikipedia) 12:09, 2 September 2013 (UTC)
- Is this bugzilla:53538? --Stefan2 (talk) 13:40, 2 September 2013 (UTC)
- My original report? Yeah, looks like it (although the thing about the late "centrally logged in" notice isn't mentioned there). I guess we'll see what happens when 1.22wmf16 is deployed (over the next few days). - dcljr (talk) 12:47, 5 September 2013 (UTC)
- I've been having some trouble staying logged in between projects (i.e. Commons) for a long time - well before https. But it happens only when I have some scripts enabled (NoScript), and admittedly... I might not have had them all enabled at the start, and so gotten to some state unexpected by the developers. Have you been using the same sort of script control? Wnt (talk) 18:00, 2 September 2013 (UTC)
- What browser are you using? If the problem is that you are logged in when you use HTTPS but not when you use HTTP, then you might wish to install HTTPS Everywhere, provided that you use a browser compatible with that. --Stefan2 (talk) 18:20, 2 September 2013 (UTC)
Updates on the HTTPS situation
[Note: I've copied this from the Village Pump (technical) talk page over at Wikipedia.]
- Update1 - Hmmm. Has something been tweaked on your end? (i.e. some lines of code at Wikipedia/Wikimedia?) Just in the last hour I discovered that I was able to access Http pages while I'm logged in -- which I was NOT able to do before (except for a very short period over the weekend). They're also showing up in my browser history, as expected. And I can see that HotCat has returned, as well (though I haven't tried using it as of yet). However -- you knew that was coming, didn't you?? -- there's still some work to be done: When I accessed a couple of articles via redirects, they displayed as though I had suddenly logged out, even though I hadn't done so and was in fact demonstrably still logged in! (The links on those pages took me to normal pages that displayed properly.) I'll post another update later on today. Needless to say, I'm keeping my fingers crossed that things won't revert to dysfunctionality in the next couple of hours! Cgingold (talk) 14:48, 2 September 2013 (UTC)
- Addendum to Update1: Complicating the picture I described in my update above, I just had a couple of pages accessed via redirects display properly -- but a couple of Category pages displayed improperly, as though I had logged out. So it seems there's no real pattern to whatever the hell is going on. <sigh> Cgingold (talk) 15:33, 2 September 2013 (UTC)
- Update2 - Okay, it is now many hours later in the day. The positive developments I noted earlier have held up without reverting to the intractable problems I experienced over the weekend. For the most part, I am able to access Http pages and they display properly -- i.e. the Wiki software recognizes that I am logged in. I can also report that HotCat is functioning normally for me -- another major relief. However, there is still a recurring problem that crops up in a seemingly random fashion, causing certain pages to display as though I was not logged in -- even though I can go directly to other pages that display normally, without having had to log in again first. I suspect, however, that these problem pages may not be entirely random -- iow, they may possibly have some feature in common that results in problems with the new software -- mainly because certain pages continue to exhibit this problem every time I access them. Case in point: Category:Biological evolution, which displays as though I'm just an Anon. IP editor every time. Guess that'll do it for now. I hope I've given you techies enough "hints" to make some headway in tracking down the source of the problem. :) Cgingold (talk) 04:32, 3 September 2013 (UTC)
Security information
When browsing Wikipedia (via Firefox, if it makes a difference), now over HTTPS, and getting the security information by pressing the lock icon next to the URL, the browser reports "you are connected to wikipedia.org which is run by (unknown)". Similarly, meta.wikimedia.org is also reported as "unknown". When clicking "more information", the browser reports under identity "owner: This website does not supply ownership information". Methinks this warrants correction. 87.68.44.102 10:02, 4 September 2013 (UTC)
- I was curious, and checked 12 days later. Using the Google Chrome browser, both the identity and the certificate seem proper for meta.wikimedia.org, using SSL with DigiCert High Assurance CA-3 and a 128-bit encrypted connection with SHA-1. I didn't check the entire CA chain, I confess. --FeralOink (talk) 13:26, 20 September 2013 (UTC)
Wikimedia sites and tracking, documentation
First, I wanted to mention that HTTPS is working perfectly for me, every time, on every WMF site, with the possible exception of Wikidata. Thank you!
Earlier, on 21 August 2013, inquiries were made by Risker
"Please add a link to some formal statement or roadmap or approved plan that says "[t]he continued work to improve the security and privacy of Wikimedia project users is a stated goal of the Wikimedia Foundation". (I'm not doubting it's true, but a footnote/reference helps.) It might also be useful to link to some internal or external site that confirms the use of Wikipedia as a site useful for tracking"
and by Ace111
"The preamble of this article mentions: "The Wikipedia project was referenced in several documents as a specific source for tracking users' behavior online". Which documents are referenced here? Was it discussed somewhere?"
These concerns were acknowledged and action was taken to address them by Seb35
"I just added a link to one document I know, but I don’t know others."
However, I agree with Ace111's subsequent response:
"Jimbo's twitter post refers to an NSA slide via The Guardian with a question: "Why are we interested in HTTP?" and an answer "Because nearly everything a typical user does on the Internet uses HTTP", followed by examples of web sites such as Facebook, Yahoo, Twitter, myspace, CNN, mail.ru, Wikipedia, Google and Gmail. I don't think this is a proper reference for Wikipedia as a specific source for tracking users' behavior."
Jimmy Wales' Twitter message reads as follows:
NSA snooping on what YOU are reading at Wikipedia means I want us to go to SSL sooner. Bastards fuck off. Owly
The Owly-shortened URL directs to this image via The Guardian, which I haven't uploaded, though I guess I should. It is not a COPYVIO, as it is an alleged screenshot of antiquated social media icons in the characteristic MS Paint, circa Win 95 style of PRISM slides. There is some lettering at the top, in a light orange colored font on a darker orange background, which might say "COMINT" (or maybe HUMINT or NOFORN or one of those other acronyms we've had 4 months with which to become familiar). That isn't sufficient as a proper reference for Wikipedia as a specific source for tracking users' behavior, just as Ace111 said. In fact, Jimmy Wales' message was directly replied to by Steven Walling
"the Foundation has not received any requests to date" https://blog.wikimedia.org/2013/07/18/wikimedia-foundation-letter-transparency-nsa-prism/ …
which leads me to wonder whether there was evidence of tracking by the NSA of Wikipedia users and/or editors et al. Next, even though he sounds full of bravado, the content of Jimmy Wales' Twitter message is rather hostile, specifically,
- Bastards fuck off.
Jimmy Wales as an individual is entitled to tell the NSA to fuck off. But using his personal Twitter account, with that particular content, as the basis for Wikimedia's decision to transition to SSL is representing the Wikimedia Foundation, I would think. We don't tell any government entity to fuck off as official policy, I don't think. We didn't tell Kazakhstan to fuck off, nor Syria, nor Libya, nor North Korea, nor the Taliban. I don't need to list further perceived bête-noire countries, as you get the idea. Remember, the NSA is acting as explicitly directed by the elected government of the USA, the executive branch and/or the Department of Homeland Security. Bad men tell the NSA to do bad things.
I don't want that Twitter message linked in the article here, please. I fully agree with and applaud Wikimedia's decision to transition to HTTPS and SSL. If further encryption measures are added in the future, I wouldn't argue against it.
In fact, why is there any need to justify the switch to HTTPS at all? Someone needs to find some non-self-referential external source (or provide better detail from this one, The Guardian article, not Jimmy Wales), or just delete that section entirely, and thus resolve the pesky issue of explaining why SSL is needed. I will take it upon myself to do so, but wanted to explain myself first, and generate discussion, as the AfD folks say! --FeralOink (talk) 01:22, 17 September 2013 (UTC)
Still being forced into using HTTPS on occasion
Just now, I had to go through the once-every-30-days log-in again. And despite having set that I do NOT want HTTPS, it forced me into HTTPS and I had to go and manually delete the cookies yet again! I understand why you are advocating HTTPS, but your reasons don't apply to me or my use of Wikimedia, so I like to keep things as simple as possible. It's bad enough that each time I visit a Wikimedia site I haven't used since the change I have to go change the preferences for that site to undo the automatic HTTPS, but having to keep deleting cookies is even worse. Carolina wren (talk) 02:15, 29 September 2013 (UTC)
- This sounds like bug 54626, the fix for which should be deployed starting on Thursday, October 3 and finishing up on October 10. Unless it gets pushed out sooner. BJorsch (WMF) (talk) 13:58, 30 September 2013 (UTC)
Preference
After this program is implemented, will individual contributors still be able to disable HTTPS if they so desire? Gryllida 22:21, 17 September 2014 (UTC)
- Why is the above question written in the future tense? This was implemented last year. Individual contributors can disable HTTPS at Special:Preferences#mw-prefsection-personal by unchecking the "Always use a secure connection when logged in" checkbox. --Stefan2 (talk) 22:52, 17 September 2014 (UTC)
- I'm referring to the planned change. In October 2014, some projects (not globally; currently this is on Russian Wikipedia) plan to roll out HTTPS by default to unregistered contributors. Some others are considering following their example, and I'm trying to assess the implications of such a change. Gryllida 23:03, 17 September 2014 (UTC)
API from contributor's username
I understand that the program implies that when I visit a page, I will be automatically redirected to HTTPS. Will the same happen for API queries I send from a bot/script? Gryllida 22:21, 17 September 2014 (UTC)
- I'm referring to the planned change. In October 2014, some projects (not globally; currently this is on Russian Wikipedia as a result of community consensus) plan to roll out HTTPS by default to unregistered contributors. Some others are considering following their example and I'm trying to assess the implications of such a change. Gryllida 23:04, 17 September 2014 (UTC)
Common HTTPS access issues
What is the common nature of the HTTPS access issues the https wikimedia.org folks encounter, as suggested here? I e-mailed them but got no response. I would like to know who we would be locking out here and why. Gryllida 22:21, 17 September 2014 (UTC)
- HTTPS access is or has been blocked by the authorities in some countries (see HTTPS#Excluded Countries). I assume that users in these countries may run into problems if they have to use an HTTPS connection. I do not know if users also may have other problems. --Stefan2 (talk) 22:55, 17 September 2014 (UTC)
- I was told that this was addressed by extensive EventLogging tests that didn't reveal trouble of the "random crappy ISP setup" sort, but only revealed country-specific issues (and this program is being rolled out outside of these countries, as I understand). It would be useful to have some documentation of the tests, for reference. Gryllida 22:57, 17 September 2014 (UTC)
Opting out for unregistered contributors
This is perhaps a slightly useless question, as people can use another internet access point to register and opt out then (if the preference is still available). But is there currently no means to opt out without registering? See also bugzilla:20151. Can we think about it again? To me, a lot of preferences appear to be harmless to expose to IP contributors (the HTTPS/HTTP toggle being one of them). Gryllida 22:57, 17 September 2014 (UTC)
- To clarify: I'm referring to the planned change. In October 2014, some projects (not globally; currently this is on Russian Wikipedia) plan to roll out HTTPS by default to unregistered contributors. Some others are considering following their example and I'm trying to assess the implications of such a change. Gryllida 23:04, 17 September 2014 (UTC)
- Where do I find information about the planned change? I hadn't heard of it until you mentioned it on this talk page.
- Users who are not logged in can still change many settings by using a tool such as w:Greasemonkey. However, I realise that this method isn't very user-friendly. I don't know if this method provides a way to opt out from forced HTTPS.
- Are you suggesting that some users won't be able to visit any Russian Wikipedia pages (including ru:Special:UserLogin?) unless the users are logged in when the page is accessed? --Stefan2 (talk) 23:28, 17 September 2014 (UTC)
- I have no idea what plans Gryllida is talking about, but official plans are at mw:Wikimedia Engineering/2014-15 Goals#Site Operations and Site Architecture.
- So far zh.wiki and others had an opt-out from the forced HTTPS for registered users but I seriously doubt there will be any opt-in mechanism for further HTTPS changes. --Nemo 06:06, 18 September 2014 (UTC)
- No, they don't plan access changes. Local consensus at Russian Wikipedia. Oh, and they opted in here. Gryllida 09:09, 18 September 2014 (UTC)
- Gryllida, where did you find this date "October 2014"? Can you give us a link? Sunpriat (talk) 20:08, 26 September 2014 (UTC)
Russian Wikinews
Please, let me know when you turn on HTTPS for Russian Wikinews (diff). --sasha (krassotkin) 07:36, 22 September 2014 (UTC)
- Done. Thanks! --sasha (krassotkin) 07:16, 5 December 2014 (UTC)
Performance impact
Do performance issues with HTTPS show up in any way in navigation timing EventLogging logs like those from which [7] is extracted? Consider the case "the page takes so long to load over HTTPS that it times out in my browser" for people with extremely slow connections. It would be nice to be at least able to assess the impact on specific countries or Wikimedia subdomains, for instance. IIRC even the simple but often-quoted [8] mentions this need for a breakdown. --Nemo 13:25, 12 June 2015 (UTC)
Turning off HTTPS
This article's content states: "If so, you can turn off HTTPS in your user preferences (unless HSTS is set for the domain in question[2]). Go to the "User profile" tab, then uncheck the box labelled "Always use a secure connection when logged in". You will need to logout and login for the preference to take effect. But remember, you will still need to log in using HTTPS." Up until today, I did not use HTTPS and was editing just fine. Today, I am being forced to use HTTPS and there is nothing in the preferences screen for me to control this. I don't want HTTPS (loss of saved edit summary info). What can I do to get back to HTTP? Hmains (talk) 17:14, 12 June 2015 (UTC)
- Sorry, that's entirely my fault for not keeping this page as up to date as I'd hoped to. The answer I got when I asked was that any form of opt-out would also leave potential security risks in our implementation which make it difficult to safeguard those who do not opt out. Because of this, we’ve made implementation decisions that preclude any option to disable HTTPS, whether logged in or not. This renders the current opt-out option ineffective, and the option will be removed at a later date after we’ve completed the transition process.
- I'm trying to figure out if there's a good solution to the edit summary info problem. /Johan (WMF) (talk) 19:27, 12 June 2015 (UTC)
Bots
edit"API requests are now being forced redirected to HTTPS on some languages. On most bot frameworks, this should just work, but a few bot maintainers have reported problems. If your bot has problems, make sure that the library you are using supports HTTPS, and has access to appropriate root certificates."
- How do you fix a library that doesn't support https to make it support that?
- What are root certificates, and what do I need to do regarding that?
- Why is this notice posted after implementation?
- My library (see en:User:RMCD bot/botclasses.php) is something I got off the now-defunct German Toolserver, and it was written by guys who have all apparently retired from Wikipedia, so I guess I need to assume the job of maintaining it, though I'm not really sure that someone isn't still maintaining it somewhere.
- I've been aware of this (potential) issue since last December: en:Wikipedia:Village pump (technical)/Archive 133#PHP API for #ifexist function? My last question in that thread never got a response.
Wbm1058 (talk) 12:26, 14 June 2015 (UTC)
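For a cURL-based library such as botclasses.php, "supporting HTTPS" mostly comes down to two things: the PHP cURL extension being built with SSL support, and cURL being able to find a bundle of root certificates to verify the server against. A minimal standalone sketch, not the library's actual code; the URL and the bundle path are only examples:
<?php
// Minimal sketch of an HTTPS API request with PHP cURL and certificate verification.
$ch = curl_init('https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);        // return the body instead of printing it
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);        // verify the server certificate...
curl_setopt($ch, CURLOPT_CAINFO, 'c:\php\cacert.pem'); // ...against a root-certificate bundle (example path)
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";      // e.g. "unable to get local issuer certificate"
}
curl_close($ch);
If this standalone request works but the bot does not, the problem is in the library or its configuration rather than in the PHP/cURL installation itself.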
Help! My code is broken!
edit"Are you a bot maintainer... and you're seeing weird or broken behavior after this switch?" YES. "Hopefully you can fix that easily." Still hoping. It's apparently not as easy as simply changing "http:" to "https:" in the code
For gadget authors, simply modifying any hardcoded URLs from "http://..." to "//..." should fix the issue (this is called using "protocol-relative URLs"). My bot logs me in if I use http://. Neither https:// nor // works to even do that.
"For bot maintainers, you have a couple of choices."
- login as the bot and select the preference to not use HTTPS for that account (citation needed)
- update your code to use SSL instead. Do you mean Secure Sockets Layer? How do I update my code to use it?
"If you use Pywikipediabot, please update to the latest version (More technical details: [9],[10]) and read this mailing list post for more information."
- I don't believe I use Pywikipediabot. Can't you do better than linking to code reviews for "more information"?
What's the difference between a "Secure Sockets Layer" and a "root certificate"? Wbm1058 (talk) 13:53, 14 June 2015 (UTC)
- Curiouser and curiouser. Legobot, which uses the same botclasses.php library, is still working. Wbm1058 (talk) 16:56, 14 June 2015 (UTC)
- There has been some divergence between versions of the botclasses.php library. See en:Wikipedia:Village pump (technical)#Impending bot armageddon for the console output from my program. Based on the prior conversation linked above, I switched my version to use JSON. I just copied Legobot's version into my system, and that errors differently:
Logging in...
POST: http://en.wikipedia.org/w/api.php?action=login&format=php (1.2910737991333 s) (194 b)
POST: http://en.wikipedia.org/w/api.php?action=login&format=php (1.1130640506744 s) (253 b)
...done.
Checking for transclusions...
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.23901295661926 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.34102010726929 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.52202987670898 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.37602114677429 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.53903079032898 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.35002017021179 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.51002883911133 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.57503318786621 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.5550320148468 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.36001992225647 s) (0 b)
GET: http://en.wikipedia.org/w/api.php?action=query&list=embeddedin&eititle=Template%3ARequested+move%2Fdated&eilimit=500&format=php (0.36002087593079 s) (0 b)
Fatal error: Uncaught exception 'Exception' with message 'HTTP Error.' in C:\php\botclasses.php:214
Stack trace:
#0 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 10)
#1 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 9)
#2 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 8)
#3 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 7)
#4 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 6)
#5 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 5)
#6 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 4)
#7 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 3)
#8 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 2)
#9 C:\php\botclasses.php(212): wikipedia->query('?action=query&l...', NULL, 1)
#10 C:\php\botclasses.php(494): wikipedia->query('?action=query&l...')
#11 C:\php\requestedmoves.php(53): wikipedia->getTransclusions
  in C:\php\botclasses.php on line 214
/**
 * Sends a query to the api.
 * @param $query string The query string.
 * @param $post string POST data if it's a post request (optional).
 * @param $repeat int how many times we've repeated this request
 * @return array The api result.
 * @throws Exception on HTTP errors
 **/
function query ($query,$post=null,$repeat=0) {
    global $AssumeHTTPFailuresAreJustTimeoutsAndShouldBeSuppressed;
    if ($post==null) {
        $ret = $this->http->get($this->url.$query);
    } else {
        $ret = $this->http->post($this->url.$query,$post);
    }
    if ($this->http->http_code() == "504" && $AssumeHTTPFailuresAreJustTimeoutsAndShouldBeSuppressed) {
        return array(); // Meh
    }
    if ($this->http->http_code() != "200") {
        if ($repeat < 10) {
            // Retry the same request up to 10 times before giving up.
            return $this->query($query,$post,++$repeat);
        } else {
            throw new Exception("HTTP Error."); // this is line 214
        }
    }
    return unserialize($ret);
}
I don't know whether this would make any difference, but unlike most other bots, mine run under the Windows 7 operating system. -- Wbm1058 (talk) 18:40, 14 June 2015 (UTC)
It runs under Windows PowerShell. I'm wondering whether I need to do anything special to enable HTTPS under PowerShell. Wbm1058 (talk) 18:55, 14 June 2015 (UTC)
- BINGO! - this seems helpful: Enabling HTTPS protocol for PowerShell remoting Wbm1058 (talk) 19:07, 14 June 2015 (UTC)
- ...but complicated. Buy an SSL certificate? Generate my own?? Help! Wbm1058 (talk) 19:19, 14 June 2015 (UTC)
Sigh, it's awfully quiet here today. Why was this implemented just before a weekend? So this is my current understanding (I have limited confidence in its accuracy). The library I'm using does not support HTTPS. It works for Legobot because that bot is on a "framework" (UNIX?) that supports HTTPS. I can either update my library code to support HTTPS, or update my Windows PowerShell "framework" to support HTTPS. I have no idea which is the easier approach. Wbm1058 (talk) 21:34, 14 June 2015 (UTC)
That code you linked to does not support a redirect (HTTP status code 301), which is what you get when you request an HTTP:// URL instead of an HTTPS:// one. Perhaps it works for you when you change the URLs you use with your bot from HTTP:// to HTTPS://, which is what I assume Legobot does. - Jan Zerebecki 11:00, 15 June 2015 (UTC)
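If the library were instead taught to follow that redirect, cURL can do it, though a 301 on a POST such as the login request is normally re-sent as a GET, so requesting https:// directly remains the safer fix. A sketch only, assuming $ch is the library's existing cURL handle:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow Location: headers such as the HTTP-to-HTTPS 301
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);         // guard against redirect loops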
- My guess is that Legobot works because the platform it runs on, the "Wikimedia Labs cluster" or whatever it's called these days, supports "301 redirects". I have copied and tried to run the identical code on my platform, Windows PowerShell, without success. Note, however, that despite claims that absolutely no HTTP works anymore, the Logging in... POST request (in the example above) still works for me. If I change it to try to log in using HTTPS:, it does not work! Wbm1058 (talk) 12:42, 15 June 2015 (UTC)
- How does logging in via https fail? Can you post debug / log output? - Jan Zerebecki 13:31, 15 June 2015 (UTC)
I just see in my console output:
Logging in...
POST: https://en.wikipedia.org/w/api.php?action=login&format=json (0.82604718208313 s) (0 b)
Login error:
I guess I should see if I can write more diagnostic information to the console. Wbm1058 (talk) 15:20, 15 June 2015 (UTC)
/**
 * This function takes a username and password and logs you into wikipedia.
 * @param $user Username to login as.
 * @param $pass Password that corresponds to the username.
 * @return array
 **/
function login ($user,$pass) {
    $post = array('lgname' => $user, 'lgpassword' => $pass);
    $ret = $this->query('?action=login&format=json',$post);
    /* This is now required - see http://bugzilla.wikimedia.org/show_bug.cgi?id=23076 */
    if ($ret['login']['result'] == 'NeedToken') {
        // Repeat the login request with the token returned by the first attempt.
        $post['lgtoken'] = $ret['login']['token'];
        $ret = $this->query( '?action=login&format=json', $post );
    }
    if ($ret['login']['result'] != 'Success') {
        echo "Login error: \n";
        print_r($ret);
        die();
    } else {
        return $ret;
    }
}
So my 'login' 'result' != 'Success', and $ret seems to be null as nothing prints. -- Wbm1058 (talk) 15:32, 15 June 2015 (UTC)
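One way to surface more diagnostics than that silent empty result is to check cURL's own error state after the request; a sketch, assuming $ch is the handle the library uses for the POST:
$ret = curl_exec($ch);
if ($ret === false || curl_errno($ch) !== 0) {
    // Prints transport-level problems, e.g. certificate or SSL handshake failures.
    echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch) . "\n";
}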
I printed out what it should look like when it works, which it does when I simply edit the URL to remove the single character "s":
Logging in...
POST: http://en.wikipedia.org/w/api.php?action=login&format=json (0.86704897880554 s) (146 b)
POST: http://en.wikipedia.org/w/api.php?action=login&format=json (1.4270820617676 s) (190 b)
Array
(
[login] => Array
(
[result] => Success
[lguserid] => 17216044
[lgusername] => RMCD bot
[lgtoken] => bd5ea22dde5601241c9a1da1eeedb5f5
[cookieprefix] => enwiki
[sessionid] => 324aa437fffa48afd8c7ac6db4754a7f
)
)
...done.
Hopefully no beans spilled here. Wbm1058 (talk) 15:56, 15 June 2015 (UTC)
The sessionid and token are secrets. You should log out with that sessionid to make it invalid.
Perhaps on your PHP installation the openssl extension is not enabled ( http://de2.php.net/manual/en/book.openssl.php ). You could try to run the following as a PHP script to test whether a normal HTTPS request works:
error_reporting(-1); var_dump(file_get_contents("https://en.wikipedia.org/wiki/Main_Page"));
- Jan Zerebecki 16:06, 15 June 2015 (UTC)
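Another quick check, assuming the PHP command-line binary is on the PATH, is to ask PHP which of the relevant extensions are actually loaded:
php -r "var_dump(extension_loaded('openssl'), extension_loaded('curl'));"
The curl extension is what the library's requests need; the openssl extension is what the file_get_contents test above relies on.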
- I see. My bot logs in every 15 minutes, and each time it logs in it gets a new sessionid. I just changed my pw so now I have a new token.
- Well, of course I don't have the openssl extension installed. I just have a minimum, bare-bones php installation, just what is needed to run my bots and nothing more. I will get right on installing http://www.openssl.org/
- Nothing on the help page HTTPS says I needed to do that!! Wbm1058 (talk) 16:40, 15 June 2015 (UTC)
- So here, do I just need to get the one marked "[LATEST]", or do I need to get more than one of these? It's not clear on that page what I need to do. Wbm1058 (talk) 17:01, 15 June 2015 (UTC)
- The INSTALL file explains how to install this library. Where is the "INSTALL file"? Wbm1058 (talk) 17:07, 15 June 2015 (UTC)
- I guess I need to install WinZip to open up that GNU Zipped Archive file. What a user-unfriendly pain. I clicked the wrong button and it started zipping up every file on my machine. I had WinZip on an older machine of mine; it seems the newer software gets, the more cumbersome it gets. Wbm1058 (talk) 17:38, 15 June 2015 (UTC)
- OK, my 21-day trial version has unpacked the .gz file, and fortunately it was only zipping up my old downloads, so I can probably just recycle-bin them. Now I have found the INSTALL.W64 file:
INSTALLATION ON THE WIN64 PLATFORM
----------------------------------
Caveat lector
-------------
As of moment of this writing Win64 support is classified "initial"
for the following reasons.
- No assembler modules are engaged upon initial 0.9.8 release.
- API might change within 0.9.8 life-span, *but* in a manner which
doesn't break backward binary compatibility. Or in other words,
application programs compiled with initial 0.9.8 headers will
be expected to work with future minor release .DLL without need
to re-compile, even if future minor release features modified API.
- Above mentioned API modifications have everything to do with
elimination of a number of limitations, which are normally
considered inherent to 32-bit platforms. Which in turn is why they
are treated as limitations on 64-bit platform such as Win64:-)
The current list comprises [but not necessarily limited to]:
- null-terminated strings may not be longer than 2G-1 bytes,
longer strings are treated as zero-length;
- dynamically and *internally* allocated chunks can't be larger
than 2G-1 bytes;
- inability to encrypt/decrypt chunks of data larger than 4GB
[it's possibly to *hash* chunks of arbitrary size through];
Neither of these is actually big deal and hardly encountered
in real-life applications.
Compiling procedure
-------------------
You will need Perl. You can run under Cygwin or you can download
ActiveState Perl from http://www.activestate.com/ActivePerl.
You will need Microsoft Platform SDK, available for download at
http://www.microsoft.com/msdownload/platformsdk/sdkupdate/. As per
April 2005 Platform SDK is equipped with Win64 compilers, as well
as assemblers, but it might change in the future.
To build for Win64/x64:
> perl Configure VC-WIN64A
> ms\do_win64a
> nmake -f ms\ntdll.mak
> cd out32dll
> ..\ms\test
To build for Win64/IA64:
> perl Configure VC-WIN64I
> ms\do_win64i
> nmake -f ms\ntdll.mak
> cd out32dll
> ..\ms\test
Naturally test-suite itself has to be executed on the target platform.
Installation
------------
TBD, for now see INSTALL.W32.
Sigh. Now I need to install Perl and Microsoft Platform SDK, then pray. I'm not feeling good about this. Do I need to do all this to get it to work? Is there another way? Wbm1058 (talk) 18:39, 15 June 2015 (UTC)
"Requirements - for windows it might be better to just get the setup at http://gnuwin32.sourceforge.net/packages/openssl.htm instead of the sources and try to compile 'em yourself ^_~" Thanks, poo flanders! I should have noticed that before! Wbm1058 (talk) 19:37, 15 June 2015 (UTC)
I heard that you only need to enable the extension, and that the Windows version of PHP already comes with everything needed for that, but I never tried it. - Jan Zerebecki 19:54, 15 June 2015 (UTC)
- @Wbm1058: To unzip gzip files on Windows, use the program 7-Zip. SSL and TLS are alternative names for HTTPS (technically that's lying slightly, but it's close enough for this conversation). Don't worry about the root certificate store for now. I recommend against trying to compile PHP from scratch and link OpenSSL to it (I think that's what you are trying to do above). That's a really complicated way of doing it. The reason why other people's bots work and yours don't is probably because you're running Windows, and PHP on Windows is a bit iffy in what extensions it installs by default.
- Now for what you need to do: per http://php.net/manual/en/wrappers.http.php (I'm assuming your bot uses PHP's fopen wrappers to do its communication) you need the openssl extension installed, and at least PHP version 4.3. You need to enable the OpenSSL PHP extension (see http://www.herongyang.com/PKI/HTTPS-PHP-Configure-PHP-OpenSSL-on-Windows.html for some instructions). PHP also needs to have access to the libeay32.dll file. This should come with PHP, but may be in the wrong directory. http://php.net/manual/en/install.windows.extensions.php#install.windows.extensions.overview and http://php.net/manual/en/openssl.installation.php may also be helpful links. Bawolff (talk) 20:39, 15 June 2015 (UTC)
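On the php.ini side, enabling the extension looks roughly like this for a PHP 5.x/7.0-era Windows install; the paths are examples for a c:\php layout:
; php.ini -- enable the OpenSSL extension on Windows (example paths)
extension_dir = "c:\php\ext"
extension = php_openssl.dll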
- Hi, nice to see someone else join the party. My library source is at en:User:RMCD bot/botclasses.php. I see two fopen calls there, but both are commented out. Others wrote this, and I have only a limited understanding of how it works. It looks like it uses the Client URL library. I trust that can work under HTTPS. Some of your links are interesting; I'm reading more on those sites to see if I can gain more insight. Thanks, Wbm1058 (talk) 21:45, 15 June 2015 (UTC)
- Yes indeed, it claims to support HTTPS, and php_curl.dll is part of my installation. Wbm1058 (talk) 21:54, 15 June 2015 (UTC)
- Oh yes, libeay32.dll is also required by php_curl.dll so I have that installed already. Wbm1058 (talk) 22:10, 15 June 2015 (UTC)
Good news: This test to connect to the Yahoo login HTTPS server was successful!
Bad news: I still get the Login error: as shown above, with (0 b) - zero bytes - returned. Wbm1058 (talk) 22:49, 15 June 2015 (UTC)
- So there are two ways to make connections in PHP - fopen or cURL. Looks like you're using cURL. Most of my comment above was assuming fopen. http://stackoverflow.com/questions/316099/cant-connect-to-https-site-using-curl-returns-0-length-content-instead-what-c might be helpful to you. Bawolff (talk)
- Thanks! Very helpful! Now I feel like the solution is in sight... I wonder why they commented out the curl_error lines? After activating them, I see:
- Error: SSL certificate problem: unable to get local issuer certificate Wbm1058 (talk) 00:29, 16 June 2015 (UTC)
- This seems helpful: cURL on Windows. I never did anything special before to install cURL, I've just used what came with PHP (in fact, I didn't realize until now that it was a separate package). Wbm1058 (talk) 00:45, 16 June 2015 (UTC)
Root certificate issue
- (This is the root certificate problem we were talking about earlier). You need to download http://curl.haxx.se/ca/cacert.pem , put it somewhere on your computer, then set
curl.cainfo=c:\php\cacert.pem
(replace c:\php\cacert.pem with wherever you saved the file) in your php.ini config file. Bawolff (talk) 04:31, 16 June 2015 (UTC)
- Q. Why did the site I linked above tell me to "Rename this file from cacert.pem to curl-ca-bundle.crt"? Wbm1058 (talk) 11:35, 16 June 2015 (UTC)
- A. I see, per http://curl.haxx.se/docs/sslcerts.html
- If you're using the curl command line tool, you can specify your own CA cert path by setting the environment variable CURL_CA_BUNDLE to the path of your choice.
- If you're using the curl command line tool on Windows, curl will search for a CA cert file named "curl-ca-bundle.crt" in these directories and in this order:
- application's directory
- current working directory
- Windows System directory (e.g. C:\windows\system32)
- Windows Directory (e.g. C:\windows)
- all directories along %PATH%
- -- Wbm1058 (talk) 11:46, 16 June 2015 (UTC)
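For PHP's cURL, as opposed to the command-line tool, the bundle can also be pointed to per request instead of via php.ini; a sketch, assuming $ch is the library's cURL handle and the path is wherever the bundle was saved:
curl_setopt($ch, CURLOPT_CAINFO, 'c:\php\curl-ca-bundle.crt'); // root-certificate bundle (example path)
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);                // keep certificate verification switched on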
OK. The instructions at cURL on Windows moved me forward towards reviving my bot, though I'm not sure all of that was necessary for my application. However, I got to the instruction "Note that if you can not get cURL SSL enabled, you have the option to run cURL in an insecure mode" and I was still getting the same error. So I punted with help I found here, and solved this problem by adding one line of code:
curl_setopt($this->ch, CURLOPT_SSL_VERIFYPEER, false);
which "helped to circumvent the problem, but totally missed the idea of https and the certification system". But I was able to sleep knowing that my bots were finally running again. Upon returning here, I found the missing piece waiting for me, commented out my "CURLOPT_SSL_VERIFYPEER → false" band-aid, and added
curl.cainfo=c:\php\curl-ca-bundle.crt
to my php.ini config file. So everything is happy now, and all that remains is for me to clean up and update my online copy of the library.
Now I see that the setup at http://gnuwin32.sourceforge.net/packages/openssl.htm is way out of date, and for that reason I probably shouldn't have installed it. I picked one of the "Light" packages from this site (didn't really know which was best) and installed it as well, but I don't know if that was necessary either, or something already bundled in PHP.
One trick to "downloading" the certificate file. All the places I've seen it just have it as a plain-text file that displays in the web browser. I haven't seen any places where you can just download it by clicking a button. I resorted to cut & paste into Notepad. Is there an easier way? When saving it you need to be careful not to save it as "curl-ca-bundle.crt.txt" -- when it's correctly saved, Windows recognizes it as a "Security Certificate" file type. Looking at the "properties" of the .crt file, I see that it opens with "Crypto Shell Extensions". There is also an option to "Install Certificate", which opens the "Certificate Import Wizard" which allows me to import the certificate to a "certificate store" which is automatically determined by the wizard. Apparently it isn't necessary to import it to a store. I see that the certificate is issued to and by Equifax Secure Certificate Authority, and is valid to 8/22/2018.
One remaining question: It seems that the "Bundle of CA Root Certificates" is periodically updated. Will I be required to update this at some point in the future, and if so, how will I know when it's needed? What will happen if I don't update it in time? Wbm1058 (talk) 18:35, 16 June 2015 (UTC)
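One possible way to fetch the bundle without copying and pasting, assuming a command-line curl.exe is available; the target path is only an example:
curl -o c:\php\curl-ca-bundle.crt http://curl.haxx.se/ca/cacert.pem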
- Probably not. Only if Wikimedia changes CAs to something currently not in the root certs. That sort of thing doesn't happen very often (although it does happen). The current root certificate should have been in the bundle for about 7 years now. Bawolff (talk) 03:19, 18 June 2015 (UTC)
- Do the changes to Windows root certificates change anything for us? --Nemo 19:53, 27 June 2015 (UTC)
- No, it only matters (in terms of accessing Wikimedia websites) if we change our root certificate at some point, which is unlikely in the near future, or if MS for some reason decided to remove the GlobalSign Root CA cert that we use, but that would be extremely unlikely anytime soon. Although Microsoft's behaviour there is kind of odd... Bawolff (talk) 11:09, 1 July 2015 (UTC)
re: "I see that the certificate is issued to and by Equifax Secure Certificate Authority, and is valid to 8/22/2018." -- that's just the first of many certificates in the bundle that Windows shows you. After editing the bundle with Notepad to remove that expired certificate, Windows showed me the next certificate in the bundle, which hasn't expired yet: issued to and by GlobalSign Root CA, valid from 9/1/1998 to 1/28/2028 Wbm1058 (talk) 01:42, 2 October 2021 (UTC)
Updating from PHP version 7.0.12 to 7.0.28
editPHP problem solved |
---|
I'm trying to upgrade from the version I successfully installed in October 2016 using the configuration described in the sections above. I wasn't expecting to run into any major roadblocks as I'm just trying to update for the latest bug and security fixes, and not to the next major release 7.1 or 7.2, but I've run into something I can't figure out. Version 7.0.12 has been successfully running in production since 10/2016. Here is what phpinfo() configuration information tells me: curl
cURL support => enabled
cURL Information => 7.50.3
Age => 3
Features
AsynchDNS => Yes
CharConv => No
Debug => No
GSS-Negotiate => No
IDN => Yes
IPv6 => Yes
krb4 => No
Largefile => Yes
libz => Yes
NTLM => Yes
NTLMWB => No
SPNEGO => Yes
SSL => Yes
SSPI => Yes
TLS-SRP => No
HTTP2 => No
GSSAPI => No
KERBEROS5 => Yes
UNIX_SOCKETS => No
PSL => No
Protocols => dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host => i386-pc-win32
SSL Version => OpenSSL/1.0.2j
ZLib Version => 1.2.8
libSSH Version => libssh2/1.7.0
When I tried to update to the current version 7.0.28 by simply overwriting files I saw:
Warning: PHP Startup: Unable to load dynamic library 'c:/php/ext\php_curl.dll' - The specified module could not be found. in Unknown on line 0
The mix of Linux and Windows slashes was a giveaway: v7.0.12 doesn't seem to need to load php_curl.dll, but 7.0.28 now needs it. So I fixed my php.ini, but that just gave me:
Warning: PHP Startup: Unable to load dynamic library 'c:\php\ext\php_curl.dll' - The specified module could not be found. in Unknown on line 0
That file is found in that directory, so it's a rather spurious and uninformative error message. Per the comments on the curl_version documentation page I disabled the "openssl" extension in php.ini and confirmed that PHP v7.0.12 and my application work fine without it. PHP cURL has nothing to do with that separate openssl extension; it has its own OpenSSL. Perhaps I'm missing a DLL file; as described in the comments here, three are needed, but I have all of those. Or are either my curl.exe or my curl-ca-bundle.crt security certificate from June 2015 incompatible with PHP 7.0.28? They were still compatible with v7.0.12 when I updated to that in October 2016. Help, please. Wbm1058 (talk) 15:36, 14 March 2018 (UTC)
So I tried just retroactively updating from 7.0.12 to 7.0.13 ...
curl
cURL support => enabled
cURL Information => 7.51.0
Age => 3
Features
AsynchDNS => Yes
CharConv => No
Debug => No
GSS-Negotiate => No
IDN => Yes
IPv6 => Yes
krb4 => No
Largefile => Yes
libz => Yes
NTLM => Yes
NTLMWB => No
SPNEGO => Yes
SSL => Yes
SSPI => Yes
TLS-SRP => No
HTTP2 => No
GSSAPI => No
KERBEROS5 => Yes
UNIX_SOCKETS => No
PSL => No
Protocols => dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host => i386-pc-win32
SSL Version => OpenSSL/1.0.2j
ZLib Version => 1.2.8
libSSH Version => libssh2/1.8.0
Sometime between these versions, something changed.
Version 7.0.18:
curl
cURL support => enabled
cURL Information => 7.51.0
Age => 3
Features
AsynchDNS => Yes
CharConv => No
Debug => No
GSS-Negotiate => No
IDN => Yes
IPv6 => Yes
krb4 => No
Largefile => Yes
libz => Yes
NTLM => Yes
NTLMWB => No
SPNEGO => Yes
SSL => Yes
SSPI => Yes
TLS-SRP => No
HTTP2 => No
GSSAPI => No
KERBEROS5 => Yes
UNIX_SOCKETS => No
PSL => No
Protocols => dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host => i386-pc-win32
SSL Version => OpenSSL/1.0.2k
ZLib Version => 1.2.8
libSSH Version => libssh2/1.8.0
So I compared php-7.0.18-nts-Win32-VC14-x86 and php-7.0.19-nts-Win32-VC14-x86 and found that the former had 37 items, while the latter has 38. Bingo! The new file added to the package is nghttp2.dll -- so now there is a fourth DLL to be added to the \php directory. Relatively few sites mention this, including this one, as most discussions on it are stale, but I found one that mentions the latest DLL addition in the answer dated 31-October-2017. It would be too easy if the spurious error message were actually truthful and said:
Warning: PHP Startup: Unable to load dynamic library 'c:\php\nghttp2.dll' - The specified module could not be found.
Of course no mention of the addition of this new file in the release notes. Sigh. Wbm1058 (talk) 00:24, 15 March 2018 (UTC)
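To summarize the Windows setup that emerged from this thread -- a sketch using the paths from the discussion above, not an official requirements list:
; php.ini -- cURL-related settings used in this thread (example paths)
extension_dir = "c:\php\ext"
extension = php_curl.dll
curl.cainfo = "c:\php\curl-ca-bundle.crt"
; php_curl.dll also needs its helper DLLs (libeay32.dll and, from the 7.0.19 packages on,
; nghttp2.dll) next to php.exe or somewhere on the PATH.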
HTTPS timeout with curl and wget
Running these commands:
- curl -s --trace o "https://en.wikipedia.org/wiki/Charlies"
- wget -q -O- "https://en.wikipedia.org/wiki/Charlies"
Results in intermittent timeouts, as reported by the debug feature of both programs. It might work 3 times in a row, then hang on the 4th, etc.
Traceroute shows I am located 10 hops from 208.80.154.224 in Virginia with a 6ms ping time with no dropped packets. I'm on Verizon FIOS.
Problem started the same time HTTPS was implemented.
curl debug (curl -s --trace o "https://en.wikipedia.org/wiki/Charlies"):
== Info: timeout on name lookup is not supported
== Info: About to connect() to en.wikipedia.org port 443 (#0)
== Info: Trying 208.80.154.224...
== Info: Timed out
== Info: couldn't connect to host
== Info: Closing connection #0
wget debug (wget -d -q -O- "https://en.wikipedia.org/wiki/Charlies"):
Setting --quiet (quiet) to 1
Setting --output-document (outputdocument) to -
DEBUG output created by Wget 1.16.3 on cygwin.
URI encoding = ‘UTF-8’
Certificates loaded: 170
Caching en.wikipedia.org => 208.80.154.224 2620:0:861:ed1a::1   <-- note: hangs here
Closed fd 4
Closed fd 4
Releasing 0x800ce0c0 (new refcount 1).
-- Green Cardamom (talk) 14:49, 17 June 2015 (UTC)
- In case anyone has similar timeout problems, I've worked around it with this:
- wget --retry-connrefused --waitretry=1 --read-timeout=2 --timeout=2 --no-dns-cache -q -O- "https://en.wikipedia.org/wiki/Charlies"
- This is working. You might need to adjust the 2-second delay depending on your connection speed. If there is no response within 2 seconds (--timeout), it will retry 20 times, but it usually succeeds after the first retry. -- Green Cardamom (talk) 15:12, 17 June 2015 (UTC)
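A rough curl equivalent of the same workaround, untested; the options are standard curl ones and the 2-second connect timeout mirrors the wget settings above:
curl -s --retry 20 --connect-timeout 2 "https://en.wikipedia.org/wiki/Charlies"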
- Ping: User:BBlack (WMF). Bawolff (talk) 03:25, 18 June 2015 (UTC)
- I don't think we're seeing this behavior in the general case (stats haven't shown it, and I can't reproduce it -- tried from a few places around the net over both v4 and v6). If this wasn't a transient issue local to you that has since vanished, can you hop on our freenode #wikimedia-operations IRC channel to debug with us at some point? BBlack (WMF) (talk) 12:10, 18 June 2015 (UTC)
- Ping: User:Green Cardamom ^
- BBlack (WMF), it's working OK right now. I'll keep trying and if it starts happening again I'll get on IRC. Thanks. -- Green Cardamom (talk) 22:28, 18 June 2015 (UTC)
Yes indeed, My code is broken!
I have written communication programs to fetch data from Wikipedia and the dictionaries.
(this was done in an environment that is called Caché or MUMPS: http://www.intersystems.com/our-products/cache/cache-overview)
I made them in order to have the possibility to check the contents for errors and much more.
Unfortunately they do not work anymore.
Somebody made the following statement:
For bot maintainers, you have a couple of choices. Either login as the bot and select the preference to not use HTTPS for that account, or update your code to use SSL instead.
How do I set the preference to not use HTTPS for an account?
Until now I have tried a lot to fix it, but I did not succeed...
Regards,
--Kvdrgeus (talk) 07:16, 21 June 2015 (UTC)
- Disabling HTTPS is not possible anymore. Your only option is to fix it to work with HTTPS in some way. - Jan Zerebecki 10:23, 21 June 2015 (UTC)
- Hi Kvdrgeus, the bot problems are noted; I hope to be able to put together some relevant information specifically for this. Due to security concerns, we haven't been able to leave the option of opting out open. See here for a technical explanation. /Johan (WMF) (talk) 13:44, 22 June 2015 (UTC)
I finally did succeed in adapting the communication routines to the new TLS protocols, so this problem has been solved...
--Kvdrgeus (talk) 12:43, 26 June 2015 (UTC)
- Excellent, I'm glad to hear that. /Johan (WMF) (talk) 08:45, 1 July 2015 (UTC)