2

My understanding is that nginx is well suited for serving static content, but I can't find any information on how well it handles a very large amount of static content. Let's say I use nginx as the web server and Node.js as the application server. I can set up nginx as a reverse proxy and load balancer, which gives me the ability to spawn multiple Node.js instances (for example, behind the reverse proxy I could have, say, 5 running Node.js apps). The architecture would look like this image here. As I understand it, this makes it fairly easy to scale a website for serving dynamic content (all you need to do is spawn more Node.js apps), while nginx serves the static content. But what if I have tons of requests for static content hitting nginx itself, and a single nginx instance just can't handle them all? I guess I could spawn multiple nginx instances, but that seems inefficient to me. I could make the Node.js apps serve static content, but as I understand it, due to Node.js's single-threaded nature that would be inefficient as well. So do you have any thoughts on how to manage nginx for a large amount of static content, or any useful information?
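For reference, here is roughly the setup I have in mind (a minimal sketch; the ports and paths are just placeholders):

```nginx
# Sketch of the architecture described above: nginx serves static files
# directly and load-balances everything else across 5 Node.js instances.
upstream node_apps {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
    server 127.0.0.1:3004;
}

server {
    listen 80;
    server_name example.com;

    # static content served by nginx itself
    location /static/ {
        root /var/www/example;
    }

    # dynamic requests proxied to the Node.js pool
    location / {
        proxy_pass http://node_apps;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```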

jerichofs
  • 41
  • 3
  • You would use a load balancer in front of multiple nginx instances. – Michael Hampton Jan 06 '19 at 14:59
  • So I have to use multiple instances of nginx anyway. Can you provide more information, please? If I use multiple instances, then do I have to use a different proxy like HAProxy in front of them? – jerichofs Jan 06 '19 at 17:53
  • First thought: you could either increase the machine resources on the box where nginx is running, or do DNS round-robin balancing across multiple nginx machines. – titus Jan 06 '19 at 19:50

2 Answers

1

The nginx documentation has an "Optimizing Performance for Serving Content" section you might try: https://docs.nginx.com/nginx/admin-guide/web-server/serving-static-content/
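The directives covered there look roughly like this (values are illustrative only, not recommendations; benchmark them on your own system):

```nginx
# Illustrative values only -- benchmark before adopting any of these.
http {
    sendfile    on;   # let the kernel copy file data straight to the socket
    tcp_nopush  on;   # send the response header and the start of the file in one packet
    tcp_nodelay on;   # do not delay small trailing packets on keepalive connections

    server {
        # larger accept queue for bursts of new connections
        # (also requires raising the OS limit, e.g. net.core.somaxconn on Linux)
        listen 80 backlog=4096;

        location /static/ {
            root /var/www/example;
        }
    }
}
```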

You might need to estimate or define how much load counts as "heavy" for your system.

Do a performance benchmark and check whether, with those optimizations, the system still cannot handle the requests. If so, adding more nginx instances is the only way out, I think.

Notes:

  • You might also try AWS S3 or Google Cloud Storage to serve static content.
  • Use a CDN: Cloudflare or AWS CloudFront.
victoroloan
  • 196
  • 4
1

Any large static content system, certainly any called a content delivery network, will have multiple (web server) nodes, for various reasons:

  • Multiple locations physically close to users lower latency.
  • Network interface speeds and memory bandwidth are limited.
  • Storage IOPS are limited.
  • High availability may be desired.

Know the limits of a single node. Push one node as far as you can in test and production, trying various configurations and sizes. When that is exceeded, scale out to multiple nodes with your choice of load-balancing solution: a load-balancing proxy, DNS, or network routing tricks. Or give a commercial CDN service a try.
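If the balancing tier is itself nginx, the scale-out might look roughly like this (a sketch only; the hostnames are placeholders):

```nginx
# Hypothetical load-balancing tier in front of several nginx nodes
# that each serve the same static content.
upstream static_nodes {
    least_conn;                    # send each request to the node with the fewest active connections
    server static1.example.com:80;
    server static2.example.com:80;
    server static3.example.com:80;
}

server {
    listen 80;

    location /static/ {
        proxy_pass http://static_nodes;
        proxy_set_header Host $host;
    }
}
```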

Any "inefficiencies" in running lots of web servers do not matter if the aggregate resources of the CDN meet the requirements. It helps the scale-out approach that medium-sized VM instances are generally inexpensive, and you can spin up many of them.

John Mahowald
  • 30,009
  • 1
  • 17
  • 32