Building my web service on EC2 right now; I have a single instance behind a load balancer, and will of course cater for multiple instances.
My initial idea was to run all the instances as dumb slaves and use S3 as shared storage. For this I've been using S3FS, but from what I've seen it's not really ready for production use in a web-serving environment. Log writes appear very late, if ever; there are numerous issues with odd caching, even with caching disabled; it's just generally a nightmare to develop on.
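For reference, this is roughly how I'm mounting it - a sketch assuming s3fs-fuse, with `my-bucket` and the mount point as placeholders (option names can vary between versions):

```shell
# Mount the bucket. Credentials come from ~/.passwd-s3fs
# (format bucket:ACCESS_KEY:SECRET_KEY, chmod 600).
# Local caching is only enabled if you pass -o use_cache=<dir>,
# so it's deliberately omitted here to try to avoid stale reads.
s3fs my-bucket /mnt/s3 \
    -o passwd_file=${HOME}/.passwd-s3fs \
    -o allow_other
```

Even mounted like this, writes still seem to surface very late.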
But the alternatives look few. One is obviously an EBS volume, which can only be attached to a single instance. Some options for sharing it:-
- SMB sharing to the other instances, with one master and the rest slaves - this obviously needs redundancy built in, perhaps with multiple EBS volumes?
- Rsync to the other boxes. This seems painful, considering it's not real-time and only updates periodically. Potentially OK if there are scripts to force an update when major changes happen.
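The rsync variant I have in mind would look something like this - a sketch only, with `master.internal` and the slave hostnames hypothetical, assuming key-based SSH between the boxes:

```shell
# /etc/cron.d/pull-webroot on each slave: pull the document root
# from the master every 5 minutes, deleting files removed upstream.
*/5 * * * * www-data rsync -az --delete master.internal:/var/www/ /var/www/

# "Force" script on the master for major changes - push immediately
# instead of waiting for the next cron tick:
#   for h in slave1 slave2; do
#       rsync -az --delete /var/www/ "$h":/var/www/
#   done
```

The cron pull keeps the slaves converging on their own; the push script covers the "major change" case so I'm not waiting five minutes for a deploy to propagate.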
Question is... what DO people DO? It seems an entirely common use case, but the variety of answers in forums and even here on SF suggests there isn't a concise answer... help wanted!