
I intend to use a single VPS to deploy several low-traffic CherryPy apps under subdirectories, e.g. example.com/app1, example.com/app2, and so on.

After researching WSGI deployment, it looks like the preferred method for deploying apps is to use a WSGI server (Gunicorn, uWSGI, etc.) with Nginx in a reverse-proxy setup. It seems like overkill to run two web servers in tandem, especially since my CherryPy app is itself a web server, but I don't want to dismiss the idea since it appears everywhere. I'm certainly not an expert, so I'd like to discuss it.

I see three options:

  • Deploy CherryPy by itself.
  • Deploy beneath Gunicorn or another WSGI server.
  • Deploy beneath a WSGI server and reverse-proxy through Nginx, which seems to be everyone's solution.

My questions:

  • What's the main reason I see this pattern everywhere? Is Nginx just that good?
  • For low-traffic apps, is the native CherryPy server good enough, or should I not even try?

Any and all advice is appreciated, thank you.

— Stephen Malone

2 Answers


Why do people put Nginx in the front?

  1. Nginx is an asynchronous web server. It doesn't dedicate a thread or a process per connection; instead it uses the operating system's preferred socket-polling facility and can therefore handle hundreds of thousands of connections. Why should you, as an application developer, care? Because Nginx buffers connections and only passes a request to your CherryPy upstream once the request has been fully read, and it does the same for responses. This way your CherryPy application, which is a threaded server, in many senses becomes asynchronous behind Nginx. In particular, you can largely stop worrying about the slow-client problem and slowloris DoS attacks.
  2. Nginx has connection rate limiting out of the box. Say you don't want more than 8 simultaneous connections from the same IP.
  3. CherryPy has had problems with SSL; Nginx doesn't.
  4. Python can shuffle bytes back and forth almost as well as C, and Python's zlib is just a wrapper around the C library. But because Nginx handles connections so efficiently, in production it's a good idea to relieve your CherryPy worker threads of serving static content and dedicate them to dynamic content only.
  5. Multiplexing several CherryPy instances on the same port, domain, path, etc. Generally, the additional flexibility of another configuration layer (see the sketch after this list for how CherryPy is told it sits behind such a proxy).
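
To make the proxied setup concrete, here is a minimal sketch of a CherryPy app that binds only to localhost and uses CherryPy's built-in proxy tool to respect the X-Forwarded-* headers a front end like Nginx sets. The port and base URL are placeholders for illustration, not a recommended configuration:

    import cherrypy


    class App1:
        @cherrypy.expose
        def index(self):
            return "Hello from app1, via the proxy"


    if __name__ == '__main__':
        cherrypy.config.update({
            # Bind to localhost only; Nginx accepts public traffic and
            # proxies it to this port (8081 is an arbitrary choice here).
            'server.socket_host': '127.0.0.1',
            'server.socket_port': 8081,
        })
        cherrypy.quickstart(App1(), '/app1', {
            '/': {
                # Rebuild redirects and absolute URLs from the
                # X-Forwarded-* headers set by the reverse proxy.
                'tools.proxy.on': True,
                'tools.proxy.base': 'https://example.com',
            },
        })

On the Nginx side you would then proxy_pass requests for /app1 to 127.0.0.1:8081.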

Is it safe to use CherryPy on its own?

According to one of the original authors, yes. You can do most web-relevant things with CherryPy on its own.

CherryPy has the notion of an application, and you can serve several applications with a single CherryPy instance. CherryPy can also serve other WSGI applications.
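
As an illustration, a minimal sketch of serving the question's two apps from one CherryPy instance, each mounted under its own path (class names are placeholders):

    import cherrypy


    class App1:
        @cherrypy.expose
        def index(self):
            return "app1"


    class App2:
        @cherrypy.expose
        def index(self):
            return "app2"


    if __name__ == '__main__':
        # Each mount() registers an independent application under its own
        # script name; one CherryPy server then dispatches to both.
        cherrypy.tree.mount(App1(), '/app1')
        cherrypy.tree.mount(App2(), '/app2')
        cherrypy.engine.start()
        cherrypy.engine.block()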

Deploying CherryPy

In a traditional *nix-style deployment you write an init script, daemonise your process, drop its privileges, write its PID file, and so on. That's not a big deal when you have a couple of CherryPy instances, but when you have dozens it becomes tedious, and it makes sense to hand process management over to Gunicorn or uWSGI and switch your CherryPy instances from HTTP to WSGI.
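
For the WSGI route, a minimal sketch (the module and class names here are placeholders) of exposing a CherryPy application as a WSGI callable that Gunicorn can manage:

    # wsgi.py
    import cherrypy


    class Root:
        @cherrypy.expose
        def index(self):
            return "Hello over WSGI"


    # cherrypy.tree.mount() returns a cherrypy.Application, which is a
    # WSGI callable; CherryPy's own HTTP server is never started here.
    application = cherrypy.tree.mount(Root(), '/')

Started with something like gunicorn wsgi:application, so Gunicorn owns the worker processes instead of an init script.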

I wrote a tutorial/project skeleton, cherrypy-webapp-skeleton, whose goal was to fill the gaps in deploying a traditional, real-world CherryPy application on Debian, from a web developer's perspective.

Wrap up

  1. Low traffic, no special requirements → CherryPy * 1 ⇐ HTTP ⇒ Client.
  2. High traffic or special requirements → CherryPy * n ⇐ HTTP ⇒ Nginx ⇐ HTTP ⇒ Client.
  3. Dozens of separate CherryPy instances on the same server, or an appetite for the overkill of everyone's solution → CherryPy * n ⇐ WSGI ⇒ Gunicorn ⇐ HTTP ⇒ Nginx ⇐ HTTP ⇒ Client.
— saaj

The reason everyone puts nginx (or another server such as Apache) in front of their app servers is that everyone has static content such as images, CSS and JavaScript, as well as strange requirements that are unique to their application.

Your app server (CherryPy, gunicorn, whatever) is optimized to run your app and serve its output. While app servers can also serve static content, they are almost never well optimized for this task, since it's secondary to their main purpose. (Some app servers will also help by minifying and compressing your CSS and JS, so that the web server in front can serve these resources even faster.)
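
As a rough sketch of what that looks like on the app-server side (paths here are placeholders), this is the kind of static serving CherryPy can do itself via its staticdir tool, and which a front-end server would normally take over in production:

    import os

    import cherrypy


    class Root:
        @cherrypy.expose
        def index(self):
            return '<img src="/static/logo.png">'


    if __name__ == '__main__':
        static_dir = os.path.join(
            os.path.dirname(os.path.abspath(__file__)), 'static')
        cherrypy.quickstart(Root(), '/', {
            '/static': {
                # Serve files from ./static through CherryPy itself;
                # in production Nginx would serve this path directly.
                'tools.staticdir.on': True,
                'tools.staticdir.dir': static_dir,
            },
        })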

In addition, a real web server can do much more than serve content quickly: caching, header manipulation, URL rewriting, geolocation, and many other features that would just bloat the app server to no good purpose.

Typically you would run the app server alone only while developing the application, when you are the only user and performance is not an issue. Even if your site is low traffic, you would like it to be faster, right? Low-traffic sites that are slow don't generally grow into high-traffic sites...

— Michael Hampton