1

I am running a LAMP application installed on one server, happily serving about 1M page impressions per month. Now I am looking into a potential partnership where my app might need to serve around 50-80M requests per month.

This is what my architecture looks like: [architecture diagram]

Images are served from static.domain.com, while the application is served by www.domain.com. The API, where 90% of the traffic comes from, is under a separate HTTPS domain, api.domain.com, but queries the same MySQL and Solr stack.

Would one root server with 128GB RAM, SSDs in software RAID 1, and an Intel® Xeon® E5-1650 v3 hexa-core (Haswell) be able to handle that load? Most requests will go against Solr and deliver just a JSON feed, potentially never hitting the disk.

Is 128GB of RAM overkill, or will one server not even be able to handle that load? I could also go with two servers and load balance. The question is how, within this architecture.

Thank you for any hint on this.

merlin
  • What sort of load/performance monitoring are you currently performing? There are a number of tools available to help gauge resource usage and allocation over time. One quick way to check would be to log performance over the course of a day and see what percentage of resources are being used. Check how resource usage scales with website load, and you should be able to work out whether your hardware will be sufficient. – Matt Aug 07 '15 at 18:29

2 Answers

6

I have a piece of string, how many knots can I tie in it?

Really it comes down to monitoring how much active load there is on the server now, and then simulating additional load to push it to the point where you can say for certain.

For the math: 50M requests/month, assuming a 30-day month, works out to about 19-20 requests per second. I'd guess it's possible; it depends on exactly how much work Solr needs to do and how much overhead your app adds.
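The arithmetic above can be sketched out, including a peak factor for uneven traffic. The 2.5x peak multiplier here is an illustrative assumption, not a measured value from the question:

```python
# Back-of-envelope throughput estimate for 50M requests/month.
requests_per_month = 50_000_000
seconds_per_month = 30 * 24 * 3600  # 30-day month = 2,592,000 seconds

average_rps = requests_per_month / seconds_per_month  # ~19.3 req/s

# Hypothetical: traffic is not spread evenly over 24 hours, so size
# for an assumed peak rather than the average.
peak_factor = 2.5
peak_rps = average_rps * peak_factor  # ~48.2 req/s

print(f"average: {average_rps:.1f} req/s, assumed peak: {peak_rps:.1f} req/s")
```

This lines up with the 40-50 req/s figure the asker mentions in the comments below, once off-peak hours are accounted for.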

Kaithar
  • Yes, I am counting on about 40-50 req/s, as not all 24 hours are at full load. However, the other commenter brought me to the point that load balancing such a setup might be a good idea. – merlin Aug 07 '15 at 20:35
  • @merlin agreed, though exactly how it smooths depends mainly on how distributed your userbase is. My observation is you tend to only get ~6 hour lulls with Europe, Americas and Australia summed. Personally I'd go for load balancing on something of this scale purely for reduced downtime... even splitting it on to two servers gives you a better ability to avoid downtime for upgrades. I didn't mention it because you didn't seem to be asking in that direction heh. – Kaithar Aug 07 '15 at 20:40
5

Can the server handle it? Yes.

but...

Whether it actually will depends completely on your code. The servers you're viewing this page on currently handle about 75 million requests per month each, and they're pretty bored while doing it, peaking at about 10% CPU. So it's definitely doable.

If you want to scale like that, you'll need to make sure that your application code scales to what you need it to, and might need optimization. Do load testing to make sure that your different application components can scale as you expect them to.

Shane Madden
  • 3
    Just a point of clarification: Server Fault runs on servers that are load balanced. Not on a single server. However, your point remains: Good code is a big part of the game, and if you really want to only use a single server (I won't talk about why that's a bad idea in this comment), then with the specs the OP gave, it can be done. – David W Aug 07 '15 at 18:38
  • 2
    @DavidW Yeah, I averaged the 670m requests per month among the servers to account for the load balancing; but it's a good point that it's a bad idea to just run on a single server for redundancy reasons. – Shane Madden Aug 07 '15 at 18:42
  • The code should be OK. I am just wondering whether it is smarter to rent that high-performance server or get two with less RAM and "somehow" load balance them. I learned from your comment that you would prefer the latter. What kind of load balancing do you have in mind for such a setup? – merlin Aug 07 '15 at 18:50
  • @merlin There are tons of options for load balancing. We use HAProxy - you could use Varnish if you want a cache too, or Apache or nginx, or any number of commercial solutions. – Shane Madden Aug 07 '15 at 19:56
  • @shane-madden Varnish sounds nice, as most of the requests to the API will be identical, e.g. "show the last 10 results of a category". I have edited my question and added an architecture illustration. What would you recommend looking into? Reliability and speed are my main concerns. – merlin Aug 07 '15 at 20:51
  • @merlin Product recommendation questions are off topic here. If that's what you are seeking, then your question should be closed as such. – user Aug 07 '15 at 21:03
  • Like I said before, I am wondering whether one server is enough to handle that load with the described architecture, or whether it would be better to spread it across two nodes. – merlin Aug 07 '15 at 21:07
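To illustrate the HAProxy option mentioned in these comments, a minimal two-server configuration sketch might look like the following. The server names, IP addresses, and the /health endpoint are placeholders for illustration, not details from the question:

```haproxy
# Sketch: terminate incoming traffic on a small load balancer and
# round-robin between two identical LAMP/Solr application servers.
frontend www
    bind *:80
    default_backend app_servers

backend app_servers
    balance roundrobin
    option httpchk GET /health   # hypothetical health-check endpoint
    server app1 10.0.0.11:80 check
    server app2 10.0.0.12:80 check
```

As Kaithar notes above, even a simple split like this buys you the ability to take one node down for upgrades without an outage, independent of raw capacity.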