
I am hosting a big campaign that is going to launch tomorrow. The expected number of visitors on the first day is upwards of 200,000. Assuming each visitor views 5 pages on average, that makes 1,000,000 page views; at 20 static files per page, roughly 20,000,000 requests.
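For what it's worth, here is the same arithmetic as a quick back-of-the-envelope sketch (all figures are the ones stated above):

```python
# Back-of-the-envelope check of the traffic estimate.
visitors = 200_000        # expected first-day visitors
pages_per_visitor = 5     # average pages viewed per visitor
files_per_page = 20       # static files requested per page

page_views = visitors * pages_per_visitor
total_requests = page_views * files_per_page

print(page_views)       # 1000000
print(total_requests)   # 20000000
```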

The server I am using is:

Processor: Intel Core i5-2400 4x3.1+ GHz 6 MB L2 - QPI 5 GT/sec
Virtualisation: VT Instructions
Turbo Boost Technology: @ 3.40GHz
Architecture: 64 bits
RAM: 16 GB DDR3
Hard disk: Intel SSD 320 (2x 120 GB)
RAID: SOFT 0/1
NIC: FastEthernet
SwitchPort: 100 Mbps

The script itself doesn't involve many MySQL queries or complex PHP operations, and the pages are served by a standard HTTP server.

Should I upgrade my server or should this be enough to handle the traffic?

Gajus

2 Answers


Have you benchmarked your campaign page with ab, siege, JMeter or similar benchmarking software? Hit the site with the tool of your choice and see how fast it is, and how badly it kills your server.
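If none of those tools are at hand, a minimal smoke test is easy to script yourself. A sketch in Python (the URL and request counts below are placeholders, not figures from this thread):

```python
import concurrent.futures
import time
import urllib.request

def smoke_test(url, requests=100, concurrency=10):
    """Fire `requests` GETs at `url` with `concurrency` workers.

    Returns (successful_responses, requests_per_second)."""
    def fetch(_):
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status

    start = time.monotonic()
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(fetch, range(requests)))
    elapsed = time.monotonic() - start

    ok = sum(1 for status in statuses if status == 200)
    return ok, requests / elapsed

# Example (placeholder URL -- point it at your own campaign page):
# ok, rps = smoke_test("http://your-campaign-site.example/", requests=1000, concurrency=50)
```

This is nowhere near as thorough as ab or siege (no latency percentiles, no keep-alive control), but it answers the basic question of whether the page falls over under concurrent load.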

The numbers themselves are not that high, so unless your campaign site is a resource hog, there shouldn't be problems.

Janne Pikkarainen
  • Is there any online service instead? I know blitz.io, but they use the same IP to make all requests, which doesn't make it an accurate test. I've seen http://loadimpact.com/pricing but their pricing is ridiculous. – Gajus Jan 25 '12 at 11:55
  • Why not? The source IP is largely irrelevant; the requests are all that matter. – adaptr Jan 25 '12 at 11:57
  • What's wrong with a good old smoke test `ab` etc. can provide? It WILL tell you whether your site is going down or going to survive your big day. – Janne Pikkarainen Jan 25 '12 at 12:05

A FastEthernet (100 Mbps) interface can become a bottleneck when handling large numbers of simultaneous connections.

If these requests are spread evenly over a 24-hour period, you are predicting on the order of 230 requests per second; this is not a huge amount, but it depends entirely on how long these requests take to process, and the size of the response.

  • What is the mix of dynamic/static content for these requests?
  • Is the database accessed sanely, i.e. using persistent proxied connections?
  • Is the database schema designed by a DBA, or an amateur?

Do some local benchmarks using a web stress utility to figure out the answers to the above, and you will be able to better estimate the load this system can handle.

adaptr