I've got a bunch of static files (e.g. index.xhtml) in an Apache2 web root. I don't have control over the server's configuration, but I am allowed to modify .htaccess in the web root.
I would like to pre-compress the files (e.g. index.xhtml.gz) to improve load times and reduce bandwidth consumption. However, if I do this, user agents that do not support auto-detecting the content encoding will be unable to use the site.
I assume that these agents will be very rare compared to capable ones, so the content should be served decompressed only if the agent doesn't send gzip in the Accept-Encoding header. Agents which claim to support gzip but actually don't are of no concern.
Most material on compression assumes it's being performed on the fly, which I'd like to avoid in order to save CPU time.
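For reference, what I have in mind is a mod_rewrite approach along these lines. This is only an untested sketch; it assumes mod_rewrite and mod_headers are enabled, and the .xhtml type is just taken from my example above:

    RewriteEngine On

    # If the client accepts gzip and a pre-compressed variant exists,
    # rewrite the request to the .gz file
    RewriteCond %{HTTP:Accept-Encoding} gzip
    RewriteCond %{REQUEST_FILENAME}.gz -f
    RewriteRule ^(.*\.xhtml)$ $1.gz [L]

    # Serve the .gz variant with the original media type, marked as gzip-encoded
    <FilesMatch "\.xhtml\.gz$">
        ForceType application/xhtml+xml
        Header set Content-Encoding gzip
        # Tell caches that the response varies with the request's Accept-Encoding
        Header append Vary Accept-Encoding
    </FilesMatch>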
AFAIK, that's only possible if you have access to run a CGI script on the box, or if you hack Apache.
But the common practice is not to do what you're asking. The common practice is to store the files uncompressed, and then use mod_deflate to compress them on the fly.
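A minimal mod_deflate setup is a single directive along these lines (the MIME-type list here is a common choice, not necessarily what you serve):

    # Compress responses of these MIME types on the fly as they go out
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/xhtml+xml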
That sort of directive is in my httpd.conf; it'll probably have to change somewhat for .htaccess.