
When trying to download a large file (I'm not sure what size triggers it; I'm trying to download a 5 GB file), the connection gets stuck:

$ wget --verbose http://example.net/large.zip -O /dev/null
--2016-12-14 12:52:38--  http://example.net/large.zip
Resolving example.net (example.net)... 1.2.3.4
Connecting to example.net (example.net)|1.2.3.4|:80... connected.
HTTP request sent, awaiting response...

And this goes on forever, and by forever I mean at least 10 minutes.

The nginx config for this kind of file:

location ~* ^.+\.(css|js|ogg|ogv|svg|svgz|eot|otf|woff|mp4|ttf|rss|atom|jpg|jpeg|gif|png|ico|zip|tgz|gz|rar|bz2|doc|xls|exe|ppt|tar|mid|midi|wav|bmp|rtf)$ {
   access_log off;
   log_not_found off;
   expires 7d;
}

Small static files (and by small I mean, for example, 50 MB ones) are served fine.

Changing access_log off; to "on" does not help; nothing appears in the logs for this request.

Changing the file type from zip to another type does not help.

The strangest thing is that if I start the download and then restart nginx, the download starts just fine. But only that one: if I start another download after the server restart, it hangs in the same way.

Andrey

3 Answers


For me the remedy was these two settings:

In the file /etc/nginx/nginx.conf, add:

proxy_max_temp_file_size 0;
proxy_buffering off;

Between the lines client_max_body_size 128M; and server_names_hash_bucket_size 256;:

http {
    client_max_body_size 128M;
    proxy_max_temp_file_size 0;
    proxy_buffering off;
    server_names_hash_bucket_size 256;
    # ... rest of the http block ...
}
algenib
  • `client_max_body_size` was enough for me. I had a Django app behind Gunicorn, and in the Nginx access.log a `504` was always present. I thought it was maybe some timeout in Nginx or Gunicorn, or even `limits.conf` (a lot of files were being requested), but it turned out there was a process in my app generating PDF files containing dozens of images, and the report was 171 MB. That's my story :). – ivanleoncz Oct 09 '19 at 18:28
    `client_max_body_size` was enough for me too! I use nginx as a reverse proxy and it was failing when the backend sent huge images. Now it works perfectly! – lucaferrario Mar 05 '20 at 10:33

The answer appears to be in these links: one, two. Basically, disable the disk cache:

location / {
  proxy_max_temp_file_size 0;
}
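
For context, a minimal sketch of where that directive would sit in a proxied setup (the upstream name here is hypothetical, not from the question):

```nginx
# Hypothetical proxied location; "backend" is an assumed upstream name.
location / {
    proxy_pass http://backend;

    # With a value of 0, nginx never spools the upstream response to a
    # temporary file on disk; whatever does not fit in the proxy buffers
    # is passed to the client synchronously as it arrives.
    proxy_max_temp_file_size 0;
}
```

The default (1024m in stock builds) lets nginx buffer a large upstream response to disk before or while serving it, which is the behavior being disabled here.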
Tim

Oops, my mistake.

There was a CDN configured in front of this server, and it needed too much time to fetch those files completely before it started serving them. That's why the download started instantly when I restarted nginx: the CDN was serving the incomplete file it had fetched so far.

Andrey
  • Ah, interesting. This is why explaining the whole environment in a question is good. Please accept your own answer. – Tim Dec 15 '16 at 18:05
  • Well, I ALWAYS forget this particular detail about this particular server. Next time I need a checklist with the only item: "Is it because of the CDN?" – Andrey Dec 16 '16 at 07:07