0

I'm going to release a website in a week or so, and I'll upload it to a public server first. My question is: if I get many users, how would I add more servers (probably dedicated) to the site? Also, when should I start thinking about adding more servers? How do I know when the servers can't handle the traffic anymore?

John Doe

1 Answer

1

Well, to answer your first question: depending on the platform your site runs on and the type of site it is, you can either split its tasks across different servers or create duplicate servers that sit behind a load balancer.

For instance, a simple static site can be expanded by copying the content to additional servers, then pointing your URL at a small load balancer that passes incoming connections to each of your servers in round-robin fashion. Each server then adds the capacity to handle roughly as many extra users as your first server could handle on its own (assuming they're all the same).
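To make the round-robin idea concrete, here's a tiny Python sketch (the backend hostnames are hypothetical; in practice a real load balancer like HAProxy or nginx does this for you):

```python
from itertools import cycle

# Hypothetical pool of identical web servers behind the balancer
backends = ["web1.example.com", "web2.example.com", "web3.example.com"]
next_backend = cycle(backends)  # endless round-robin iterator

def route():
    """Hand each incoming connection to the next server in turn."""
    return next(next_backend)

# Six requests are spread evenly across the three servers
assignments = [route() for _ in range(6)]
```

Because each server gets every third request, the pool's total capacity grows roughly linearly as you add servers.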

                         _________________
                         |               |
                         | load balancer |
                         |_______________|
                      /           |       \
                     /            |        \
                    /             |         \
            __________         __________    \ __________
            |        |         |        |      |        |
            |   web  |         |   web  |      |   web  |     etc...
            | server |         | server |      | server |
            |________|         |________|      |________|

Now if it's a more dynamic site that involves reading from and writing to a database, with users interacting with each other, it's a little more complicated. Basically, you separate the single server's tasks across multiple servers to distribute the load. For instance, you would have one (or several, using replication) database servers supplying your front-end web servers with content. You can then have multiple web servers serving the actual HTML/PHP/whatever, all getting their content from the same database server(s). A load balancer still splits the traffic between the web servers, so even on a dynamic site everyone sees the same thing, because all web servers read from the same database environment.

For scaling the database tier specifically, you have one "master" server where all of your writes happen, which replicates its changes down to a number of "slave" servers where you do all your reads. Most sites do many more database reads than writes, so this works well and lets you spread those reads over as many slave servers as you need.

                         _________________
                         |               |
                         | load balancer |
                         |_______________|
                      /           |       \
                     /            |        \
                    /             |         \
            __________         __________    \ __________
            |        |         |        |      |        |
            |   web  |         |   web  |      |   web  |     etc...
            | server |         | server |      | server |
            |________|         |________|      |________|

                 ^                 ^               ^
     each web server writes to the master and reads from the slaves


                         _________________
                         |     Master    |
                         |  (for writes) |
                         |_______________|
                      /           |       \
                     /            |        \
                    /             |         \
            __________         __________    \ __________
            | slave  |         | slave  |      | slave  |
            |(for    |         |(for    |      |(for    |     etc...
            | reads) |         | reads) |      | reads) |
            |________|         |________|      |________|

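The write-to-master, read-from-slaves routing in the diagram above can be sketched like this (hostnames and the query-prefix check are illustrative; real apps usually get this from their DB driver or an ORM):

```python
from itertools import cycle

class ReplicatedDB:
    """Sketch of master/slave routing: writes go to the master,
    reads are round-robined across the slaves (hypothetical hosts)."""

    def __init__(self, master, slaves):
        self.master = master
        self.slaves = cycle(slaves)

    def host_for(self, query):
        # Anything that modifies data must hit the master so that
        # replication can fan the change out to every slave.
        if query.lstrip().upper().startswith(("INSERT", "UPDATE", "DELETE")):
            return self.master
        return next(self.slaves)  # reads scale out across the slaves

db = ReplicatedDB("db-master", ["db-slave1", "db-slave2"])
targets = [db.host_for(q) for q in
           ["SELECT * FROM posts",
            "INSERT INTO posts VALUES (1)",
            "SELECT * FROM users"]]
```

Adding read capacity is then just a matter of cloning another slave and adding it to the pool; the master only ever has to handle the (much smaller) write load plus replication.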
Now to answer your second question: telling when you need more servers is pretty simple. A few different resources can be exhausted on a server -- available CPU cycles, memory, disk space/bandwidth, and network bandwidth -- and there are tools to watch each of them. Assuming a Unix-based OS, you can use "top" to monitor CPU and memory usage, "df -h" to see disk usage, and "iftop" to monitor network usage. If any of these is constantly running against the ceiling of what you have available, and you notice slowdowns when using your website, it's time to add another server (or several). Based on which resource you see running out, you'll know how to spec the new servers to better handle those loads.
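As a rough self-check, the CPU and disk numbers that "top" and "df -h" show are also available from Python's standard library; this sketch flags when either is near its limit (the thresholds are illustrative -- tune them for your own hardware, and this only covers CPU load and disk, not memory or network):

```python
import os
import shutil

def capacity_report(disk_path="/", load_limit=None, disk_limit=0.90):
    """Flag CPU saturation and a nearly-full disk."""
    if load_limit is None:
        # A 1-minute load average above the core count means
        # processes are queueing for CPU time.
        load_limit = os.cpu_count()
    load1, _, _ = os.getloadavg()          # same figures "top" shows
    usage = shutil.disk_usage(disk_path)   # what "df -h" reports
    return {
        "cpu_saturated": load1 > load_limit,
        "disk_nearly_full": usage.used / usage.total > disk_limit,
    }

report = capacity_report()
```

Run something like this from cron and alert on it, and you'll hear about a resource crunch before your users do.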


Hope that helps!

Ben Baron
  • But if I had one server used as a database, how would I connect it to another server? – John Doe Mar 24 '11 at 17:58
  • So you mean if you already have a web server and a database server, how do you expand to more database servers? My experience is specifically with MySQL, but I'm sure it's similar with other DBMSes. It's actually pretty simple, but it requires a little site downtime (you can do it at 4am or so). 1. Turn off the db server. 2. Copy the database's data files to the new slave server. 3. Configure the new slave to replicate from the original master. 4. Turn on the master. 5. Turn on the slave. Now they will be in sync and the slave will get all changes from the master. – Ben Baron Mar 25 '11 at 20:09
  • Then, any time you need to add more slaves (or do backups) you don't have to turn off your master server. You just turn off a slave, clone its data to another server (which becomes a second slave), and turn them both back on. You can create many slave servers this way, and then have different slaves handle the read load for different processes/parts of the site, or for backups. – Ben Baron Mar 25 '11 at 20:10