
I won't go into specifics on the server specs, since I know there's no real answer for this. But I've been doing load testing today with Apache's ab tool.

I got 70 requests per second (1,000 requests with 100 concurrent users) on a page that loads from 4 different DB tables and does some manipulation of the data, so it's a fairly heavy page.
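For reference, a run like the one described can be reproduced with ab; the URL below is a placeholder, not my actual page:

```shell
# 1,000 total requests, 100 at a time (URL is a placeholder)
# ab -n 1000 -c 100 http://your_server/heavy-page.php
#
# ab reports the headline figure on a line like the one below;
# pulling the number out with awk:
report='Requests per second:    70.00 [#/sec] (mean)'
printf '%s\n' "$report" | awk '{print $4}'   # prints 70.00
```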

The server isn't used for anything else yet, and the only load on it is me, since it's still in development. But the application will be used daily by many users.

But is this enough? Or should I even worry, as long as it's over X requests per second?

I'm thinking that I shouldn't worry but I'd like some tips on this.

Ólafur Waage

5 Answers


70 requests per second works out to 252,000 page renders per hour.

If you assume that the average browsing session for your site is 10 pages deep, then you can support 25,000 uniques / hour.

You should probably check these numbers against your expected visitor count, which should be available from the folks on the business side.

Many of the sites I work on see about 50% of their daily traffic in a roughly 3 hour peak period on each day. If this is the case with your site (it depends on the kind of content you provide, and the audience), then you should be able to support a daily unique visit count of around 150,000.

These are pretty good numbers; I think you should be fine. It's wise to look into opcode caching and database tuning now, but remember: premature optimization is the root of all evil. Monitor the site, look for hotspots, and wait for traffic to grow before you go through an expensive optimization effort for a problem you may not have.
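The arithmetic above can be checked in a few lines of shell; the 10-page session depth and the 3-hour peak carrying 50% of daily traffic are the assumptions stated in this answer:

```shell
rps=70
hourly=$((rps * 3600))               # 252000 page renders per hour
uniques_hour=$((hourly / 10))        # 25200 uniques/hour at 10 pages per session
daily=$((uniques_hour * 3 * 2))      # 3-hour peak carrying 50% of daily traffic
echo "$hourly $uniques_hour $daily"  # prints 252000 25200 151200
```

The 25,200 and 151,200 figures match the rounded 25,000 and 150,000 quoted above.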

Tim Howland
  • Very good answer, exactly what I was looking for. I have not done any optimization and wanted to get baseline numbers to see where I was sitting at the moment. – Ólafur Waage May 14 '09 at 15:00
  • "If you assume that the average browsing session for your site is 10 pages deep, then you can support 25,000 uniques / hour"... under the assumption that every unique sends a request every second during the session. If a visitor needs to think before a next click/request - for example 5 seconds - you'll be able to support more visitors in parallel sessions. – Jochem Schulenklopper Jun 05 '15 at 15:55
  • I totally agree about "premature optimization" from a code perspective. But, choosing a flawed design as a whole is actually the root of all evil. Nothing can overcome a flawed design except a rewrite. – Jeff Fischer Jul 11 '16 at 23:58
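The think-time point raised in the comments can be made concrete with a Little's-law estimate (concurrent sessions ≈ request rate × seconds per page, think time included); a sketch using the commenter's 5-second figure and treating render time as negligible:

```shell
rps=70       # measured page throughput
think=5      # seconds a visitor spends per page (the comment's assumption)
sessions=$((rps * think))
echo "$sessions"   # prints 350: concurrent sessions sustainable at 70 req/s
```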

I have used two tools to watch the performance of my Apache servers in the past.

One is Munin, which graphs all sorts of things, including the number of Apache instances, number of connections, available memory, processor usage, and so on, and helps me determine when I am approaching a danger zone, and why.

The second one is simply the apache server-status page (http://your_server/server-status?refresh=10) which lets me see the state of each connection, along with how many free connections are available at any given moment.
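The server-status page has to be enabled first; a minimal config sketch, assuming Apache 2.4 syntax (older 2.2 setups use Order/Allow directives instead, and you should adjust the access rule for your own network):

```apache
# Enable mod_status so /server-status works
ExtendedStatus On
<Location "/server-status">
    SetHandler server-status
    Require local
</Location>
```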

Guido
Brent

I'd suggest you worry only if you think your app will be very busy when it goes live. Is the page in question likely to be hit that hard? Harder? Less? If you have no idea, I suspect it's unlikely to be a problem early on. If it's your slowest page, you'll know one place to look if you have to optimize the system later.

There are also lots of things you can do to tune most web servers and database engines to squeeze more performance out.

acrosman
  • I'd like to be prepared for a small rush of requests. For example, a very quick page can handle around 110 requests per second, while the server can handle 2,900 rps on an empty page. – Ólafur Waage May 14 '09 at 12:48

Once your site is live you could also look at mod_top, which will give you a real-time view of the current load on Apache. I've not installed it myself, but it certainly seems to offer more information and a better breakdown of load than the standard Apache server-status page.


You state in a comment that your server can handle 2,900 requests per second on an empty page. That indicates pretty strongly that the bottleneck isn't the web server itself; it's the page processing.
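That gap can be put in milliseconds: at 2,900 requests per second the server's own overhead is well under half a millisecond, so almost all of the ~14 ms spent on the heavy page is application and database work. A quick check, using the figures from the question's comment:

```shell
awk 'BEGIN {
  empty = 2900; heavy = 70               # req/s from the comments above
  overhead = 1000 / empty                # ms the server itself needs per request
  processing = 1000 / heavy - overhead   # ms spent rendering the heavy page
  printf "overhead %.2f ms, processing %.2f ms\n", overhead, processing
}'
# prints: overhead 0.34 ms, processing 13.94 ms
```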

If you're using PHP, consider an opcode cache like APC. If the DB is the bottleneck, memcached will help you as well.

ceejayoz