I have an nginx reverse proxy in front of a node.js backend server. In my node app, I am able to stream responses as they become ready, so that the client can start downloading resources referenced in the <head>
section of the HTML before the entire response is received.
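To be concrete, the kind of streaming I mean is roughly this (a minimal sketch using Node's built-in http module, not my actual app code; renderSlowBody is just a placeholder for the slow part of the render):

```js
// Minimal sketch of early-flush streaming (illustrative only).
const http = require('http');

http.createServer(async (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });

  // Send the <head> immediately so the browser can start fetching CSS/JS
  // while the rest of the page is still being generated.
  res.write('<!doctype html><html><head><link rel="stylesheet" href="/app.css"></head><body>');

  const body = await renderSlowBody(); // placeholder for the slow work (DB queries, rendering, etc.)
  res.write(body);
  res.end('</body></html>');
}).listen(3000);

// Placeholder: stands in for whatever takes time to produce the rest of the page.
function renderSlowBody() {
  return new Promise((resolve) => setTimeout(() => resolve('<p>rest of the page</p>'), 1000));
}
```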
But when I put nginx in between, the whole response is buffered before being sent to the client. From this answer, I understand that I can disable this by setting proxy_buffering off;
in my nginx config. However, the nginx docs explain that without proxy buffering, a slow client makes the node backend wait (which is why nginx buffers by default).
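In other words, the "just disable buffering" approach would look something like this (upstream address and port are placeholders), but it gives up the slow-client protection described above:

```nginx
location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_buffering off;   # stream upstream bytes straight to the client,
                           # but a slow client now ties up the node backend
}
```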
What I want is the best of both worlds. I want nginx to receive a response and immediately start streaming it to the client. If the client is slower than the backend, nginx buffers the response from the backend and feeds it to the client while the backend is free to process other requests. Oh, and I also want nginx to gzip the streamed response for me on the fly.
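If I had to guess at the shape of the config, I imagine something like the following, though I honestly don't know whether these directives combine to give early streaming plus buffering for slow clients (sizes are guesses):

```nginx
# Wishful-thinking sketch: buffering stays on for slow clients, in the hope
# that nginx still starts forwarding bytes as soon as it has them.
location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_buffering on;
    proxy_buffer_size 8k;
    proxy_buffers 16 8k;
    proxy_busy_buffers_size 16k;
    proxy_max_temp_file_size 0;   # don't spill to disk, keep it in memory buffers
}
```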
Is this configuration possible? With all of the performance advice about early flushing, I can't be the first person to want this kind of setup. I scanned the nginx docs for the proxy module, but I couldn't find a setting to accomplish this.
Also, on the gzip side of things, it seems like the way to combine streaming with gzip is chunked transfer encoding, which nginx supports. But apparently HTTP/2 no longer supports chunked encoding. What's the new solution then? Or does it "just work" when you tell nginx to apply gzip to a stream?
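For reference, by "tell nginx to apply gzip" I mean the usual directives, roughly:

```nginx
gzip on;
gzip_proxied any;   # also compress responses coming from the proxied node backend
gzip_types text/css application/javascript application/json;   # text/html is compressed by default
gzip_min_length 1024;
```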