
I'm trying to set up an HTTP streaming server I wrote with Tornado and Python. Basically, it keeps the connection alive and occasionally flushes information out. It's a bit like long polling, except the server doesn't break the connection after each message.
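The handler is basically doing something like this (a minimal sketch, not my actual code; the names and the 2-second interval are just for illustration, and it uses the @tornado.web.asynchronous style of the Tornado releases current at the time):

```python
import tornado.ioloop
import tornado.web


class StreamHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        self.set_header("Content-Type", "text/plain")
        # Push a chunk every couple of seconds and never call finish(),
        # so the connection stays open.
        self.ticker = tornado.ioloop.PeriodicCallback(self.push, 2000)
        self.ticker.start()
        self.push()

    def push(self):
        self.write("tick\n")
        self.flush()

    def on_connection_close(self):
        self.ticker.stop()


if __name__ == "__main__":
    tornado.web.Application([(r"/stream", StreamHandler)]).listen(8000)
    tornado.ioloop.IOLoop.instance().start()
```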

Is it possible to put something like this behind nginx? When I test it from my browser, I can't see any output until the server breaks the connection, at which point it all arrives at once.

Paul

2 Answers


You need to turn proxy_buffering off for the streaming requests. If every request to the backend will be streaming, you can simply set proxy_buffering off; in the relevant location block of your nginx config. As the proxy_buffering documentation entry notes, you can also manage buffering on a per-request basis by having your backend send an X-Accel-Buffering response header to turn buffering on or off for that response.
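For the per-request route, the backend just has to emit that header before it starts flushing. A minimal Tornado sketch (handler name and output are illustrative, and it assumes a Tornado version where @tornado.web.asynchronous is available):

```python
import tornado.web


class StreamHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        # Ask nginx not to buffer this particular response.
        self.set_header("X-Accel-Buffering", "no")
        self.set_header("Content-Type", "text/plain")
        self.write("stream opened\n")
        self.flush()  # reaches the client immediately once buffering is off
        # ...keep calling self.write()/self.flush() as data becomes available,
        # and self.finish() when the stream is done.
```

Either approach works; the header is handy when only a few endpoints stream and you want buffering left on for everything else.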

kolbyjack

Just a guess: is tcp_nodelay set to off? It is on by default unless explicitly turned off; see the nginx documentation.

Sameer
  • Hm, that didn't do it. I should also mention that the Tornado app acts as expected when I access it directly, so I'm guessing nginx is doing some kind of buffering? – Paul Apr 25 '11 at 03:59