I have a mostly static site running on Ruby on Rails that uses Varnish as a reverse proxy cache to save hits to the Rails backend.
The problem is that users can log in to the site, and when they do we use ESI (Edge Side Includes) to render the user-specific parts of the page.
Using ESI means that we have to disable gzip compression on the Rails backend (Nginx + Passenger), otherwise Varnish cannot parse the data returned from the backend in order to run ESI processing.
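For reference, the backend vhost boils down to something like this (hostname, port and paths are placeholders, not our real values):

    # Backend nginx + Passenger vhost (illustrative values only)
    server {
        listen 8080;
        server_name backend.example.com;

        # Has to stay off, otherwise Varnish receives gzipped bodies
        # that it cannot parse for ESI processing
        gzip off;

        passenger_enabled on;
        root /var/www/myapp/public;
    }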
My question is: do the benefits of using a reverse proxy cache outweigh the benefits of gzipping all your content? Or should I try to get rid of ESI completely and have the best of both worlds?
You could get the best of both worlds if you arrange things like so:
User -> nginx -> Varnish -> Rails
Turn gzip compression on for the nginx-to-user segment: that's the slowest and also the most costly leg. I am assuming that your nginx, Varnish and Rails instances are local to each other, so your local bandwidth should be more than sufficient. Besides, it does not make much sense to gzip a response only to have to decompress it again to assemble the ESI fragments.
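As a rough sketch, the public-facing nginx vhost could look something like this (the port numbers and gzip settings are illustrative, not a drop-in config):

    # Public-facing nginx: compress on the way to the user,
    # hand everything else to Varnish (assumed to listen on 6081)
    server {
        listen 80;
        server_name example.com;

        gzip            on;
        gzip_min_length 1024;
        gzip_types      text/css application/javascript application/json;

        location / {
            # Varnish talks to Rails uncompressed, so ESI keeps working
            proxy_pass http://127.0.0.1:6081;
            proxy_set_header Host            $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }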
If bandwidth is not a problem, and load times are acceptable without gzip, you should definitely leave gzip turned off.
Gzipping takes a fair amount of CPU. So if you're more concerned about CPU than bandwidth, if the site loads fast enough, and if ESI is a great help to you, then the reverse proxy cache definitely has more benefit than gzipping HTTP responses.
In other situations, where bandwidth is critical, gzip can be more important, but that doesn't seem to be the case here.
Finally, gzipping can be done by some reverse proxies. This is a great possibility, because reverse proxies generally don't use much CPU (if they are on a separate server). This saves the backend a lot of CPU cycles and saves bandwidth as well, but at the moment, if I remember correctly, Varnish doesn't support gzipping.