
Update: I have added comments below explaining why this is not a duplicate of the fundamental capacity-planning question.

Pre-Info: I am a generalist specialized in game design, with semi-practical software engineering skills. Due to the hard times we are going through, I am the one who gets things done: I set up the site and the dedicated server, passed all the security and W3C standards tests, handled compression, and so on. Managing a high volume of requests, however, is a serious challenge for me.

Actuals: We run a fan-comics website that will receive a large amount of traffic on a single day (presumably 25K, at most 250K unique users; exact timing unclear). It currently runs on a single-IP dedicated server:

Intel Xeon E3-1230 x1
Cores: 4x 3.2 GHz 
RAM: 16 GB DDR3 ECC
HDDs: 2x 1TB SATA 7.2k RPM
Conn: 100 Mbit unmetered uplink, based in France.

The dedicated server runs Ubuntu 14, Apache, PHP, and MySQL for WordPress. The average page weighs 2 MB (about 30-40 requests per page), and a whole visit sums to about 10 MB per user. Apache should be able to deliver ~500 requests per second under the current setup, but I am not sure of the real-world numbers; it may be way off, and due to the network limitation it will probably be 100-150 concurrent users for an 'okay' experience without a CDN.
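To sanity-check that network limitation, here is a back-of-envelope calculation. The 2 MB page weight, 10 MB per visit, and 100 Mbit uplink come from the figures above; the 10-minute average visit length is an assumption, and real bursty arrivals make the practical concurrency much lower than the steady-state average this computes:

```python
# Back-of-envelope throughput check for the origin server's uplink.
# Page weight, visit size, and uplink speed are taken from the question;
# the 10-minute average visit length is an assumption.
UPLINK_MBIT_PER_S = 100
PAGE_MB = 2.0
VISIT_MB = 10.0
VISIT_SECONDS = 10 * 60

uplink_mb_per_s = UPLINK_MBIT_PER_S / 8        # 12.5 MB/s
pages_per_s = uplink_mb_per_s / PAGE_MB        # simultaneous full-page deliveries
per_user_mb_per_s = VISIT_MB / VISIT_SECONDS   # average per-user bandwidth
concurrent_users = uplink_mb_per_s / per_user_mb_per_s

print(f"{pages_per_s:.2f} full-page deliveries per second")
print(f"~{concurrent_users:.0f} average concurrent visitors at uplink saturation")
```

This suggests the 100 Mbit uplink, not the CPU, is the first bottleneck: only a handful of full pages can be pushed out per second, which is exactly the kind of load a CDN takes off the origin.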

My question is: if I buy and set up a CDN service and cache all the assets, how will the server-side processes work? Apart from the contact form there are no dynamic requests running on the server, and the majority of the website is static, yet:

1- Under which conditions will the CDN require the origin server to run queries or render output?
2- If there are 20,000 concurrent requests to the CDN, with all static assets served through it, what processing power or capacity would my dedicated server need?
3- Alternatively, how would a queue work? (It would keep track of how many visitors are active and serve clients as resources become available.)
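Regarding point 3, here is a minimal sketch of the counting logic behind such a "waiting room" queue. The `MAX_ACTIVE` limit is a hypothetical number, and commercial waiting rooms hand out tokens or redirects rather than blocking threads, but the accounting is the same:

```python
import threading

# Hypothetical waiting-room queue: at most MAX_ACTIVE visitors are
# served at once; everyone else waits until a slot frees up.
MAX_ACTIVE = 150  # assumed concurrency budget, not a measured value

slots = threading.BoundedSemaphore(MAX_ACTIVE)

def serve_visitor(render_page):
    """Block until a slot is free, serve the page, then release the slot."""
    with slots:  # acquired on entry, released on exit
        return render_page()

# Usage: each incoming connection calls serve_visitor(...)
print(serve_visitor(lambda: "page delivered"))
```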

I have done my research, even if it was not very extensive, and I am not sure I have asked the right questions, but I would be very thankful for your support.

  • I understand your point. I believe it is not a duplicate, and here is why: I requested practical data from people who are experienced and know the numbers (which may be proportionally estimated and adapted to the current case); the concept of a queue is something I came up with but don't know well, and it may be invalid given current technologies and approaches; I would like to hear from experienced people about the combination of a CDN with such a barebones dedicated-server setup and how the data flow works; and a single day of peak traffic is an edge case to solve. – Atahan Bozkurt Dec 19 '17 at 23:07
  • The answers to your questions are all "it depends". The conditions in which the CDN hits your server will vary depending on how dynamic your content is, how long you cache stuff for, etc. The processing power required will depend on the efficiency of your code and how complex its tasks are. Queues can work in a number of different ways. – ceejayoz Dec 20 '17 at 01:07
  • Hello @ceejayoz, I have been doing further research (as I declared, it is not my skill set); let me share what I've concluded. The site is not dynamic except for text. I've calculated the caching interval (1 week), server RAM usage (since it uses the inefficient Apache prefork MPM), connection capacity (adding delays before closing connections to free RAM), and per-second connections (considering the timeouts and delays mentioned above), and evaluated all the options out there (integrations, cost-effective CDNs, extras, etc.). So I decided to use Fastly (to serve CSS, JS, and images), and around 500 connections per second for the DB is viable. – Atahan Bozkurt Dec 20 '17 at 01:13
  • That may scale up to 50K unique users per minute (at the content-consumption rate), and that is the worst, dystopian scenario (below 1% chance). So I understand how it is a duplicate, and I'll share the setup and data after I get through the high-traffic day, hoping it is well calculated and won't fail. – Atahan Bozkurt Dec 20 '17 at 01:15
  • If you have a known day of high traffic, run a load test simulating the expected amount. Sites like Blitz.io can help with that. – ceejayoz Dec 20 '17 at 01:27
  • I'd hope the CDN support staff of whatever CDN you will be using can help you a bit further. As for running the tests, there are less expensive ways to do that: spin up a number of DO droplets or AWS/EC2 systems and run httperf or something similar on them. But the sad truth is your employer will have to budget time and/or money to get this done right. – Jenny D Dec 22 '17 at 09:10

1 Answer


For your static assets (images, style sheets, JavaScript) you can set proper HTTP caching headers, which control how frequently the CDN connects to your server to fetch content. This will mostly save bandwidth, not that much CPU.
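As one possible sketch on Apache, caching headers for static assets can be set with mod_headers. The file-extension list and the one-week TTL are assumptions (the TTL matches the caching interval mentioned in the comments); adjust them to your release cadence:

```apache
# Serve static assets with a one-week cache lifetime so the CDN only
# revisits the origin about once per week per asset.
# Requires mod_headers to be enabled (a2enmod headers on Ubuntu).
<IfModule mod_headers.c>
  <FilesMatch "\.(png|jpe?g|gif|css|js|woff2?)$">
    # "public" allows shared caches such as the CDN to store the response;
    # 604800 seconds = 1 week.
    Header set Cache-Control "public, max-age=604800"
  </FilesMatch>
</IfModule>
```

With headers like these in place, the CDN serves cached copies for the TTL and only hits your origin on a cache miss or after expiry; the origin then mainly handles HTML and the contact form.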

However, every site has different requirements, and we cannot give authoritative answers as to whether your dedicated server can handle the load. You need to perform the research yourself or hire a professional to do it. The basic principles are described in Can you help me with my capacity planning?.

Tero Kilkanen
  • I've already read it; the reasons it is not directly the fundamental question are explained in a comment on the original question. In a general sense, yes, it is "capacity planning", but I've already done that for the common-usage estimate; what I lack is information on how to handle the rare peak-day case. – Atahan Bozkurt Dec 19 '17 at 23:12