
I've recently switched to a FastCGI setup for PHP (Apache2-worker and mod_fcgid). However, when a single PHP script is very busy, it seems to block all other PHP requests. What would be wrong with my configuration?

My main reason for using mod_fcgid is to keep PHP memory usage under control. With mod_php, all individual Apache forks grow in memory after serving PHP.

I've also switched to the apache2-worker MPM, since that way all thread-unsafe PHP code runs outside of the Apache processes.
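
For reference, which MPM is actually active can be checked like this (the control binary may be apachectl rather than apache2ctl on some systems):

apache2ctl -V | grep -i mpm    # should report "Server MPM: Worker"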

My FastCGI script looks like:

#!/bin/sh
#export PHPRC=/etc/php/fastcgi/

# PHP manages its own pool: 5 children, each recycled after 5000 requests.
export PHP_FCGI_CHILDREN=5
export PHP_FCGI_MAX_REQUESTS=5000

global_root=/srv/www/vhosts.d/
exec /usr/bin/php-cgi5 \
  -d "open_basedir=$global_root:/tmp:/usr/share/php5:/var/lib/php5" \
  -d disable_functions="exec,shell_exec,system"

My Apache config looks like this:

<IfModule fcgid_module>
  FcgidIPCDir /var/lib/apache2/fcgid/
  FcgidProcessTableFile /var/lib/apache2/fcgid/shm
  FcgidMaxProcessesPerClass 1
  FcgidInitialEnv RAILS_ENV production
  FcgidIOTimeout 600
  AddHandler fcgid-script .fcgi

  FcgidConnectTimeout 20
  MaxRequestLen 16777216

  <FilesMatch "\.php$">
    AddHandler fcgid-script .php
    Options +ExecCGI
    FcgidWrapper /srv/www/cgi-bin/php5-wrapper.sh .php
  </FilesMatch>
  DirectoryIndex index.php
</IfModule>
– vdboor

2 Answers


Found the answer at: https://stackoverflow.com/questions/598444/how-to-share-apc-cache-between-several-php-processes-when-running-under-fastcgi/1094068#1094068

The problem isn't PHP, but mod_fcgid. While PHP spawns multiple children, mod_fcgid is unaware of them and only sends one request to each process at a time. Hence, with FcgidMaxProcessesPerClass 1, all PHP requests are executed one after another. [*]

The answer there links to http://www.brandonturner.net/blog/2009/07/fastcgi_with_php_opcode_cache/, which explains how to use mod_fastcgi instead. mod_fastcgi doesn't have this limitation: it will send multiple requests to the same child process.

[*] Note that not using FcgidMaxProcessesPerClass 1 results in many separate instances of PHP, Ruby, etc., even though each of them is capable of handling many requests internally within a single process.


Hence, a new Apache config to use PHP with mod_fastcgi:

<IfModule mod_fastcgi.c>

    # Needed for suEXEC: FastCgiWrapper On
    FastCgiConfig -idle-timeout 20 -maxClassProcesses 1 -initial-env RAILS_ENV=production
    FastCgiIpcDir /var/lib/apache2/fastcgi

    AddHandler php5-fcgi .php
    Action php5-fcgi /.fcgi-bin/php5-wrapper.sh
    DirectoryIndex index.php

    ScriptAlias /.fcgi-bin/ /srv/www/cgi-bin/
    <Location "/.fcgi-bin/php5-wrapper.sh">
        Order Deny,Allow
        Deny from All
        #Allow from all
        Allow from env=REDIRECT_STATUS
        Options ExecCGI
        SetHandler fastcgi-script
    </Location>

    # Startup PHP directly
    FastCgiServer /srv/www/cgi-bin/php5-wrapper.sh

    # Support dynamic startup
    AddHandler fastcgi-script fcg fcgi fpl
</IfModule>
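
The wrapper script is the same php5-wrapper.sh as in the question. With mod_fastcgi the PHP_FCGI_CHILDREN=5 setting is actually put to use, because multiple requests are routed to the single FastCgiServer process, which PHP then spreads over its own children. A rough way to confirm the process layout (assuming the process name is php-cgi5 as above):

ps -C php-cgi5 -o pid,ppid,cmd   # expect one parent and 5 children sharing its PID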
– vdboor

First, your wrapper script and setup are Just Plain Wrong, unless Apache's docs are out of date. Read the "Special PHP Considerations" section in the mod_fcgid docs and use the script and example settings given there. Your current setup will basically spawn a bunch of unusable PHP child processes, and every 5001st PHP request will error out: PHP exits after the 5000th request, but you are missing the FcgidMaxRequestsPerProcess 5000 directive that tells mod_fcgid it needs to start a new PHP process after 5000 requests.
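
Something along these lines (untested; values are only illustrative) is what those docs describe: leave process management to mod_fcgid rather than PHP, and keep FcgidMaxRequestsPerProcess in sync with PHP_FCGI_MAX_REQUESTS.

#!/bin/sh
# PHP_FCGI_CHILDREN deliberately not set: mod_fcgid, not PHP, manages the processes
export PHP_FCGI_MAX_REQUESTS=5000
exec /usr/bin/php-cgi5

together with, in the Apache config:

FcgidMaxRequestsPerProcess 5000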

As for simultaneous PHP processes, each simultaneous request requires its own PHP process, so you will need to raise your FcgidMaxProcessesPerClass directive to a higher number.
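
For example (10 is just a starting point; tune it to the concurrency you expect):

FcgidMaxProcessesPerClass 10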

– DerfK
  • When I kill the php-cgi5 process, it's just recreated, so that doesn't seem to happen. Currently PHP manages its children instead of mod_fcgid doing so. I'll still try out your suggestions and see how that works! – vdboor Aug 23 '11 at 07:30