Due to budget constraints, I'm using the smallest droplet that DigitalOcean provides; it runs a small (4 or 5 users) legacy PHP application for a client until I can rewrite it for them. I have just installed Redis to try to boost performance for a few of the more expensive database queries. I have looked at the questions listed at the bottom and believe I still need to ask this one. I do not want to do "capacity planning" at this stage. I just want to come up with a "<sane value>" for maxmemory, per the accepted answer of the Redis question cited below, based on the typical demands of Apache httpd 2.2.15, PHP 5.3.3, and MySQL 5.1.73 running on CentOS 6.9 (like I said, legacy).
The output from free -h when no one is using the app is:

             total       used       free     shared    buffers     cached
Mem:          1.0G       809M       197M       368K       179M       484M
-/+ buffers/cache:       145M       860M
Swap:           0B         0B         0B
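
If it helps to show where my head is at, this is the back-of-the-envelope arithmetic I keep doing; the 512M reserve for the OS, Apache, PHP, and MySQL is purely my own guess, which is exactly the number I'm unsure about:

reserve=512   # MB held back for the OS, Apache, PHP, and MySQL -- a guess
avail=$(free -m | awk '/buffers\/cache/ {print $4}')   # "free" column of the -/+ line
echo "candidate maxmemory: $((avail - reserve))M"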
I am concerned by the fact that /etc/php.ini has memory_limit = 256M, and I am alarmed by the output of config get maxmemory in redis-cli, as the default setting apparently makes no attempt to find out what is actually available on the system:
127.0.0.1:6379> config get maxmemory
1) "maxmemory"
2) "3221225472"
My gut tells me that if I naively set memory_limit and maxmemory both to 98M, I am going to experience grief when the app is running at capacity and the system suddenly decides to fire up some housekeeping processes, or when I need to jump into vi to fix something on the fly (not that I'd ever do that on a production machine). Hmm, and I just now realized I haven't even taken MySQL's working memory into consideration (more on how I plan to measure that below). Note that #551727 discusses tuning strategies, but doesn't actually answer the question of how much free memory should be maintained.
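
To get a handle on that, my plan is to snapshot what the services actually hold resident while someone is exercising the app; a rough sketch (the process names are the CentOS defaults, and the httpd total overstates reality because the worker children share memory):

ps -o rss=,comm= -C httpd,mysqld,redis-server |
  awk '{sum[$2] += $1} END {for (p in sum) printf "%-13s %5.0f MiB\n", p, sum[p]/1024}'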
Questions researched:
Can you help me with my capacity planning?
How much free memory should I have on my webserver?
https://stackoverflow.com/questions/33115325/how-to-set-redis-max-memory