Talk:404 handler caching

(Do discussions about the subject go here, or in the article itself?)

The described mechanisms could certainly work. However, why do manually what Squid is built to do?

About the problem with logged-in users: Does anybody know how large a proportion of the overall work goes into serving pages that cannot be cached? In other words: How big is the problem, actually? If it's big, then I certainly think that some client-side technologies should be employed, even if it means that logged-in users must accept JavaScript, cookies, etc. - At least until a technology like ESI has been incorporated and tested, see http://www.aulinx.de/oss/code/wikipedia/

Troels Arvin 19:18, 2004 Jun 2 (UTC)

First, I don't anticipate adding an extra daemon just to do caching, and I doubt other MediaWiki users would prefer that.
Second, I don't know much about Squid, but I do know that HTTP caching tends to work better with static files than with dynamically created streams. So I'm wondering whether Squid would actually work better in combination with 404 handler caching than with what we have now. --Evan 20:48, 3 Jun 2004 (UTC)
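(For reference, a minimal sketch of what such a 404 handler could look like, assuming Apache is pointed at it with an ErrorDocument 404 directive; the paths and the render_wiki_page() function are made-up stand-ins, not MediaWiki's actual code:)

 <?php
 # Sketch of the 404-handler caching idea: Apache would be configured with
 #   ErrorDocument 404 /handler404.php
 # so a request for a page that has no static file yet lands in this script.

 function render_wiki_page($title) {
     # Stand-in for the real renderer; MediaWiki would build the page here.
     return "<html><body><h1>" . htmlspecialchars($title) . "</h1></body></html>";
 }

 # Simplified: take the last path component as the page title (ignores
 # query strings, namespaces, URL encoding, ...).
 $title     = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
 $cacheFile = '/var/www/static/' . $title . '.html';

 $html = render_wiki_page($title);

 # Write the rendered page out, so the next request for this URL is served
 # by Apache as a plain static file and never reaches PHP at all.
 file_put_contents($cacheFile, $html);

 header('HTTP/1.1 200 OK');   # override the 404 status set by Apache
 header('Content-Type: text/html; charset=UTF-8');
 echo $html;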
The key to proper HTTP caching is sending the relevant HTTP headers; it doesn't matter whether the content is dynamically created or not. And proper HTTP headers are something you need to handle anyway (although this is too often overlooked; some webmasters seem to want to forget that not all user agents communicate directly with the HTTP server).
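(To make that concrete, a small sketch of the kind of headers a dynamically generated page can send so that a proxy such as Squid can cache it like a static file; the lifetime and the timestamp source are assumptions, not MediaWiki's actual values:)

 <?php
 # Placeholder timestamp; MediaWiki would take this from the page's
 # last-modified ("touched") field.
 $lastModified = time() - 600;

 header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
 # s-maxage is aimed at shared caches such as Squid; clients still revalidate.
 header('Cache-Control: s-maxage=3600, must-revalidate');

 # Honour conditional requests so the proxy can revalidate with a cheap 304
 # instead of fetching the whole page again.
 if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
     strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
     header('HTTP/1.1 304 Not Modified');
     exit;
 }

 echo "<html><body>dynamically generated page body</body></html>";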
My main objection to the 404 handler idea is that I think it's a bad strategy to manually add code for something for which a good, dedicated solution already exists. The reverse proxy solution (e.g. using Squid, as now) has the added advantage that it's easy to differentiate caching: some URL types may need a different caching profile than other URLs. With a reverse proxy, this is a simple matter of adjusting the HTTP headers that are sent. If you do it by way of static files (perhaps generated by a 404 handler), then you need all sorts of tricks with Apache and file suffixes.
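(Again just as an illustration, with made-up URL patterns and lifetimes rather than actual MediaWiki routes: with a reverse proxy in front, differentiating caching is simply a matter of sending different headers per URL type:)

 <?php
 $uri = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

 if (preg_match('#^/wiki/Special:#', $uri)) {
     # Special pages differ per request/user; keep shared caches away.
     header('Cache-Control: private, no-cache');
 } elseif (preg_match('#^/skins/#', $uri)) {
     # Stylesheets and images change rarely; let everything cache them a day.
     header('Cache-Control: public, max-age=86400');
 } else {
     # Ordinary article views: let Squid keep them briefly, then revalidate.
     header('Cache-Control: s-maxage=600, must-revalidate');
 }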
About non-Wikipedia uses of MediaWiki: Of course, it would be nice to have a MediaWiki which works perfectly out of the box in all situations. However, you tend to get bloated and overly complicated products if you try to make them scale to every demand profile. I think the focus for MediaWiki should be on the main functionality of being an advanced wiki: handling articles, changes, users, etc. Users of MediaWiki who run into performance problems can then turn to the common solutions for such problems; a reverse proxy is one of the usual tools to employ in that case.
--Troels Arvin 08:06, 2004 Jun 4 (UTC)


Too bad, the following link is dead: http://wikitravel.org/~evan/wormwiki-0.1.tar.gz -- a proof-of-concept wiki engine that uses this technique for showing pages.
