Want to make pages load in a blink.
My front page takes 1.1 s to load at a ping of 74 ms. A similar English Wikipedia page (ping 72 ms, and actually a 40% larger document on a heavily used host) loads in 0.4 s. My site runs MediaWiki software under APC on a dedicated host with a 1 GHz Athlon; there is barely any traffic, and the machine has free memory to spare.
Is there a way to speed up page loads, or anything else I could look at? I've tried Squid and saw no improvement at all.
The test I used was:

    time wget url
Thanks.
P.S. I've asked a similar question before, but I think it was not stated clearly enough.
You must determine your bottleneck and remove it.
Utilize caching techniques, such as the APC you mentioned and memcached, and tune your DB configuration; for example, if you use MySQL, enable the query cache in my.cnf.
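On the MediaWiki side, a minimal sketch of pointing the object cache at memcached in LocalSettings.php (the server address is an assumption; CACHE_ACCEL would use APC instead):

    <?php
    // LocalSettings.php -- use memcached as MediaWiki's object cache.
    // The address below is an assumed local memcached instance.
    $wgMainCacheType    = CACHE_MEMCACHED;
    $wgMemCachedServers = array( '127.0.0.1:11211' );

And for the MySQL query cache, the relevant my.cnf lines might look like this (the sizes are illustrative, not tuned recommendations):

    [mysqld]
    # Enable the query cache and give it a modest slice of memory.
    query_cache_type  = 1
    query_cache_size  = 32M
    query_cache_limit = 1M

You can check that it took effect with SHOW VARIABLES LIKE 'query_cache%'; after restarting mysqld.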
Get a better Internet connection for your server. When developing, use techniques such as CSS sprites [1] to minimize the number of separate HTTP requests needed to download the content (where possible; see the sketch below). Download YSlow and see what it has to say about your site [2].
"YSlow analyzes web pages and why they're slow based on Yahoo!'s rules for high performance web sites."
We can speculate all day about why the page isn't loading quickly... or you can enable profiling within MediaWiki. This should let you see how much time PHP, APC, MySQL, etc. take to generate the page, and that should give you enough data to figure out what to do next. I do suspect that since your server isn't busy, it's not serving anything out of cache, because you're the first requestor.
http://www.mediawiki.org/wiki/Profiling#Profiling
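As a sketch of what that page describes (the exact file and class names vary by MediaWiki version, so treat this as an assumption and follow the page above): older releases ship a no-op stub profiler in StartProfiler.php at the wiki root, which you swap for the real one:

    <?php
    // StartProfiler.php -- replace the stub with the real profiler so
    // every request records per-function timings. File names differ
    // between MediaWiki versions; see the link above.
    require_once( dirname( __FILE__ ) . '/includes/Profiler.php' );

With profiling to the database enabled in LocalSettings.php ($wgProfileToDatabase in older versions), the accumulated timings can then be browsed via profileinfo.php; again, check the page above for your version's exact switches.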
Some things to consider:
If you're using Firefox, use Firebug's Net panel and YSlow to figure out where your page is stalling. (I once took pride in having a front page that was tuned to fit inside a single Ethernet frame, but that was well before Web 2.0.)
Wikipedia caches all pages for non-logged-in users, so you are seeing an in-RAM or on-fast-disk cache. Log in and load a page, and it will take longer. That being said, there is plenty of MySQL tuning that can be done with globs of RAM to make MediaWiki run reasonably fast.
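As an example of that kind of tuning, a my.cnf sketch that throws RAM at the usual suspects (the option names are standard MySQL; the sizes are illustrative, not recommendations for your particular box):

    [mysqld]
    # Cache MyISAM indexes and InnoDB data/indexes in memory;
    # pick sizes that fit the RAM you actually have free.
    key_buffer_size         = 128M
    innodb_buffer_pool_size = 256M
    table_cache             = 512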
See also the Wikimedia server and caching-strategy pages. On the first page they claim a Squid cache hit rate of 78%. They store rendered pages in a huge 6 GB memcached store, and use 6 GB of memory to cache the database (99% cache hit rate).
A default MediaWiki install without this sort of caching will give the dreadful performance you are seeing. Wikipedia is MediaWiki behind a facade of five layers of caching and globs of RAM.
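A default install can still get a crude version of the same trick with MediaWiki's built-in file cache, which serves pre-rendered HTML to anonymous visitors. A sketch for LocalSettings.php (the directory path is an assumption, and the exact settings are version-dependent):

    <?php
    // LocalSettings.php -- serve pre-rendered pages to anonymous users
    // from disk instead of re-parsing wikitext on every request.
    $wgUseFileCache       = true;
    $wgFileCacheDirectory = '/var/cache/mediawiki'; // assumed path; must be writable by the web server
    $wgShowIPinHeader     = false; // per-user output would defeat the cache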