
I'm using apache2 with mod_fastcgi to run PHP on a private shared server. I have combined this with suexec so I can run each virtual host as its own UNIX user, keeping WordPress owners happy.
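For reference, each virtual host is wired up roughly like this (the hostname, user and paths below are placeholders, not my real ones):

<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /home/site1/public_html
    # suexec: run this vhost's CGI/FastCGI as its own user and group
    SuexecUserGroup site1 site1

    # per-vhost wrapper script lives in the vhost's own cgi-bin
    ScriptAlias /cgi-bin/ /home/site1/cgi-bin/
    AddHandler fastcgi-script .fcgi

    # hand .php requests to the wrapper
    Action application/x-httpd-php /cgi-bin/php.fcgi
    AddType application/x-httpd-php .php
</VirtualHost>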

One site occasionally gets a wave of traffic due to scrapers on Twitter, which spawns 10+ php-cgi processes at roughly 50MB each and causes a lot of OOM errors on one of my 512MB slave servers.

I would quite like to limit each user to 4 PHP processes, whilst keeping the global maximum at 8 processes. In the configuration file that loads mod_fastcgi.so, I have defined the following:

FastCgiConfig -maxClassProcesses 4 -maxProcesses 8
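For context, that directive sits in the same conf file that loads the module, roughly as below (the module path is from memory, and the FastCgiWrapper line is what I believe routes the dynamically spawned php-cgi processes through suexec):

LoadModule fastcgi_module /usr/lib/apache2/modules/mod_fastcgi.so
# use the suexec wrapper for dynamically spawned FastCGI applications
FastCgiWrapper On
# at most 4 processes per application class, 8 across all classes
FastCgiConfig -maxClassProcesses 4 -maxProcesses 8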

In my cgi-bin/php.fcgi file for each virtual host I have also set the following:

#!/bin/bash
### Path to the php-cgi binary ###
PHP_CGI=/usr/bin/php-cgi
### 0 children: let mod_fastcgi manage the number of php-cgi processes ###
PHP_FCGI_CHILDREN=0
### recycle each php-cgi process after 1000 requests to limit memory creep ###
PHP_FCGI_MAX_REQUESTS=1000
### no editing below ###
export PHP_FCGI_CHILDREN
export PHP_FCGI_MAX_REQUESTS
exec "$PHP_CGI"

Unfortunately, this hasn't helped much: when load testing the servers I still find more than 4 php-cgi processes running for a single user, resulting in more OOM issues. What am I doing wrong here?
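In case I'm measuring the wrong thing, this is how I count the processes during the load test (site1 stands in for the vhost's UNIX user):

# list php-cgi processes owned by one vhost user, with resident memory in KB
ps -u site1 -o pid,rss,args | grep '[p]hp-cgi'
# or just the count
pgrep -u site1 -c php-cgi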

MrNorm