I've got a bunch of static files (e.g. `index.xhtml`) in an Apache2 web root. I don't have control over the server's configuration, but I am allowed to modify `.htaccess` in the web root.
I would like to pre-compress the files (e.g. `index.xhtml.gz`) to improve load times and reduce bandwidth consumption. However, if I do this, user agents that do not support auto-detecting the content encoding will be unable to work with the site.
I assume that these agents will be very rare compared to capable agents, so the content should be served decompressed only if the agent doesn't send `gzip` in the `Accept-Encoding` header. Agents that claim to support gzip but don't are of no concern.
Most resources about compression assume it's being performed on the fly, which I'd like to avoid in order to reduce CPU time.
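For concreteness, here's a sketch of the kind of `.htaccess` rules I'm imagining (untested; this assumes the host enables `mod_rewrite` and `mod_headers` and that `AllowOverride` permits `FileInfo`):

```apache
RewriteEngine On

# Serve the pre-compressed variant only when the client advertises
# gzip support AND the .gz file actually exists on disk.
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}\.gz -f
RewriteRule ^(.+)$ $1.gz [L]

# Restore the original media type and label the encoding, so the
# .gz responses aren't served as application/gzip downloads.
<FilesMatch "\.xhtml\.gz$">
    ForceType application/xhtml+xml
    Header set Content-Encoding gzip
    Header append Vary Accept-Encoding
</FilesMatch>
```

Agents without `gzip` in `Accept-Encoding` would fall through the rewrite and receive the uncompressed `index.xhtml` as-is. Is something along these lines workable from `.htaccess` alone, and are there pitfalls I'm missing?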