
Possible Duplicate:
How do you do Load Testing and Capacity Planning for Web Sites

I have a personal webserver (dedicated) which has 2GB RAM, 500 GB HDD, Unlimited Bandwidth.

Currently there are about 15 sites. Some get about 1,000 hits per day, and the quietest get about 150 per day.

Most of the sites are WordPress blogs and the others are ASP.NET applications. None of the sites are "heavy".

My question is: how many more websites can my webserver handle? I am a web developer, so what server specs should I consider before renting out a server for a website?

Shoban

4 Answers


There's no definitive answer to that question, as it depends entirely on the load and feature usage of those sites. Also consider that load changes over time, so you can't plan completely by looking at current usage patterns alone.

The best advice I can give is to monitor current usage and come up with average load/visitor numbers. Then plan for the safety margin you want to have, and host sites or buy servers accordingly.
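A minimal sketch of the kind of averaging meant here, assuming you've already pulled daily hit counts out of your access logs (site names and numbers below are made up):

```python
# Hypothetical daily hit counts per site, e.g. parsed from access logs.
daily_hits = {
    "blog-a": [950, 1020, 1100, 980],
    "blog-b": [140, 155, 160, 150],
}

for site, hits in daily_hits.items():
    avg = sum(hits) / len(hits)
    peak = max(hits)
    # Plan capacity against the peak plus a safety margin, not the average.
    planned = peak * 1.5
    print(f"{site}: avg={avg:.0f}/day, peak={peak}, plan for {planned:.0f}")
```

The 1.5x margin is arbitrary; pick whatever headroom lets you sleep at night.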

Mark S. Rasmussen

Look at your processor usage and resource consumption during peak times and throughout the day. Extrapolate from there to how many more sites would bring the server to no more than 75% utilization. Keep monitoring this number as you add more sites.
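The extrapolation above can be sketched as a back-of-the-envelope calculation (every number here is a hypothetical placeholder; plug in your own stats):

```python
# Rough headroom estimate from observed peak CPU usage.
current_sites = 15
peak_cpu = 0.30        # e.g. 30% CPU at the busiest hour (hypothetical)
ceiling = 0.75         # plan to stay under 75% at peak

# Naive linear extrapolation: assumes new sites are roughly as
# "heavy" on average as the existing ones.
cpu_per_site = peak_cpu / current_sites
max_sites = int(ceiling / cpu_per_site)
print(max_sites - current_sites)  # -> 22 (room for ~22 more such sites)
```

The linearity assumption is the weak point: one disproportionately heavy new site invalidates it, which is why continued monitoring matters.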

You don't want to push the limit: the smaller your headroom, the more easily one site that has a big day, or some inefficient code (i.e., the site becomes "heavy"), can bring down your machine if you are not careful.

Yaakov Ellis

I may be absolutely wrong here, but from the little experience I have had dealing with dedicated web servers, I can say hosting websites is not nearly as resource-intensive as some people think.

If you have at least a dual-core CPU, processing power should be fine for a while, so the bottleneck you would most likely worry about is disk I/O, which is usually fixed by the old "throw RAM at it" move (more RAM means more disk cache).

The only real way to know is to watch the stats. When you start seeing a bottleneck forming, people getting timeouts, or the server running out of resources, then it's time to do a bit of upgrading or consider outsourcing.
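As one small piece of that watching-the-stats routine, a minimal latency/timeout probe might look like this (the URL is a placeholder; point it at one of your own sites and run it from cron or similar):

```python
# Minimal HTTP probe: reports status code and response time,
# or None on timeout / connection failure.
import time
import urllib.request

def probe(url, timeout=5.0):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except Exception:
        status = None  # timed out or could not connect
    return status, time.monotonic() - start

# Example (hypothetical URL):
# status, elapsed = probe("http://example-site-on-my-server.com/")
```

Real monitoring tools (Nagios, Munin, etc.) do this and much more, but a ten-line probe is enough to start collecting trend data.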

Shard

If you are serving static content (either plain HTML, or HTML generated by something like Movable Type), then the answer, for any contemporary hardware, is: a lot. Where "a lot" is probably measured in tens of millions of static requests per day.

In that kind of setup, the first limitation you'll run into will be the size of your server's connection to your data center. Most dedicated server vendors will start you off with a 10 Mbit connection, which is probably the first thing that will max out if you approach the number of requests quoted above. Generally they will switch you to a 100 Mbit port for little or no charge, but be aware this just means a 10x increase in how quickly your bandwidth cap (if you have one) can be exhausted. Pay close attention and monitor your monthly usage closely, lest you pay large overage fees.
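To sanity-check the "tens of millions of requests per day" figure against a 10 Mbit link, the arithmetic is straightforward (the average response size is an assumption, not a measured number):

```python
# How many static requests/day can a saturated 10 Mbit link serve?
link_mbit = 10
bytes_per_sec = link_mbit * 1_000_000 / 8   # = 1.25 MB/s
avg_response_bytes = 10 * 1024              # assume ~10 KB per static response
requests_per_day = bytes_per_sec / avg_response_bytes * 86_400
print(int(requests_per_day))  # -> 10546875, i.e. ~10.5 million/day
```

So at roughly 10 KB per response, the 10 Mbit pipe itself tops out around ten million requests a day; smaller responses push that number up, larger ones pull it down.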

So, once you've got a 100 Mbit connection, the next potential problem will be the speed of getting your data from the hard drive to the network. Even at 100 Mbit, that is still only about 12.5 MB per second from the hard drive, which is trivial for contemporary hardware. Given a decent amount of free memory (for disk cache) and a good mix of file sizes (from a few hundred bytes for your favicon.ico to a few hundred KB for a big photo), you'll still probably cap out a 100 Mbit connection before hitting serious load.

However, all of this assumes a site that serves static content, which is almost never true. If you're using a web framework like Django, Rails, or Grails, or any of the hundreds out there, then your first bottleneck will be CPU, the second will be memory, and the third will be the amount of concurrency your application can handle.

Dave Cheney