I'm trying to set up an HTTP streaming server I wrote with Tornado and Python. Basically, it keeps the connection alive and occasionally flushes information out. It's a bit like long polling, except the server never breaks the connection.
Is it possible to put something like this behind nginx? When I test it from my browser, I can't see any output until the server breaks the connection; then it all arrives at once.
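For context, here is a minimal sketch of the kind of streaming handler described (the route `/stream`, the tick messages, and the one-second interval are all placeholders, not the asker's actual code):

```python
import asyncio
import tornado.web

class StreamHandler(tornado.web.RequestHandler):
    """Keeps the connection open and periodically flushes data to the client."""

    async def get(self):
        # Ask nginx not to buffer this particular response (per-request override).
        self.set_header("X-Accel-Buffering", "no")
        self.set_header("Content-Type", "text/plain")
        for i in range(10):
            self.write(f"tick {i}\n")
            await self.flush()        # push buffered bytes to the client now
            await asyncio.sleep(1)    # placeholder interval between flushes

def make_app():
    return tornado.web.Application([(r"/stream", StreamHandler)])
```

Without a proxy in between, each `flush()` reaches the browser immediately; the buffering described in the question happens in nginx, not in Tornado.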
You need to turn proxy_buffering off for the streaming requests. If all requests to the backend will be streaming, you can just set proxy_buffering off in your nginx config. As the nginx documentation for that directive notes, you can also manage buffering on a per-response basis by having your backend send an X-Accel-Buffering header to turn buffering on or off.
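A minimal sketch of such a config, assuming the Tornado backend listens on `127.0.0.1:8000` and serves the stream at `/stream` (both placeholders):

```nginx
location /stream {
    proxy_pass http://127.0.0.1:8000;  # hypothetical Tornado backend
    proxy_http_version 1.1;
    proxy_buffering off;               # pass response bytes through as they arrive
    proxy_read_timeout 1h;             # example value; keep long-lived streams open
}
```

Alternatively, leave `proxy_buffering on` globally and have the backend send `X-Accel-Buffering: no` only on its streaming responses.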
Just a guess: is tcp_nodelay set to off? It is on by default unless explicitly turned off. See the nginx documentation for the tcp_nodelay directive.
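For reference, the directive sits in the `http`, `server`, or `location` context, e.g.:

```nginx
http {
    tcp_nodelay on;  # the default; disables Nagle's algorithm on keep-alive connections
}
```

Since it defaults to on, this is only relevant if a config somewhere has turned it off.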