
When enabling <Proxy ... enablereuse=on max=10>, I start receiving strange responses. When refreshing the current page, the main request loads a different response each time: a blank page, a response intended for a separate client, or a 404 response for a CSS file on the requested page.

Removing enablereuse fixes the strange responses, but prevents concurrent requests from the same user, meaning each request is served one at a time.

For example: when opening two browser tabs to two different URLs on the same vhost domain, if the first requested page takes 5 seconds to load, the second tab does not load until the first has completed.

I am trying to prevent this by allowing the same client to perform multiple requests simultaneously, in a concurrent, non-blocking manner.

Server Environment

CentOS 6.10 x64
php 5.6.37 Remi
Apache 2.4.33 IUS

MPM Event configuration

<IfModule mpm_event_module>
    ServerLimit              100
    StartServers             4
    ThreadLimit              64
    MaxRequestWorkers        100
    MinSpareThreads          25
    MaxSpareThreads          75
    ThreadsPerChild          25
    MaxConnectionsPerChild   1000
    ListenBacklog            511
</IfModule>

Virtual Host Config (1 of 4 - all identical except IP address, UDS and ServerName)

<VirtualHost 192.168.1.71:443>
    ServerName example.com:443
    DocumentRoot /home/example/example.com
    <IfModule mod_ssl.c>
        SSLEngine on
        SSLCertificateFile /etc/httpd/ssl/certs/example.crt
        SSLCertificateKeyFile /etc/httpd/ssl/private/example.key
        SSLCertificateChainFile /etc/httpd/ssl/certs/example.ca-bundle
        <IfModule mod_setenvif.c>
            SetEnvIf Authorization "(.*)" HTTP_AUTHORIZATION=$1
        </IfModule>
        <IfModule mod_headers.c>
            Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains"
        </IfModule>
    </IfModule>
    <Directory "/home/example/example.com">
        AllowOverride All
        Require all granted
    </Directory>
    <IfModule mod_proxy_fcgi.c>
        <FilesMatch \.php$>
            <If "-f %{REQUEST_FILENAME}">
                SetHandler "proxy:unix:/var/run/example.sock|fcgi://127.0.0.1/"
            </If>
        </FilesMatch>
        <Proxy "fcgi://127.0.0.1" enablereuse=on max=10>
            ProxySet timeout=7200
        </Proxy>
    </IfModule>
</VirtualHost>

PHP-FPM pool Config (1 of 4 - all identical except UDS)

[example_com]
user = example
group = example
listen = /var/run/example.sock
listen.owner = example
listen.group = apache
listen.mode = 0660

pm = dynamic
pm.max_children = 20
pm.start_servers = 2
pm.min_spare_servers = 2
pm.max_spare_servers = 20
pm.max_requests = 1000

security.limit_extensions = .php

I have also tried using a TCP proxy instead of UDS, since other posts mention issues with UDS not being supported, but the issue persists:

<IfModule mod_proxy_fcgi.c>
    <FilesMatch \.php$>
        <If "-f %{REQUEST_FILENAME}">
            SetHandler "proxy:fcgi://127.0.0.1:9000/"
        </If>
    </FilesMatch>
    <Proxy "fcgi://127.0.0.1:9000" enablereuse=on max=10>
        ProxySet timeout=7200
    </Proxy>
</IfModule>

I have also tried changing the PHP-FPM config with pm set to dynamic, ondemand, and static, with appropriate process-count changes; a sketch of the static variant is below.
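For reference, a rough sketch of the static variant I tried (the worker count is illustrative; the rest mirrors the pool config above):

[example_com]
user = example
group = example
listen = /var/run/example.sock
listen.owner = example
listen.group = apache
listen.mode = 0660

; static: a fixed set of workers is forked at startup, no scaling under load
pm = static
pm.max_children = 20
pm.max_requests = 1000

security.limit_extensions = .php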


I determined that the restriction on concurrent requests was due to PHP sessions and the lock imposed on file-based sessions. However, this does not account for the strange responses I received.
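For anyone hitting the same blocking behaviour, a minimal sketch of working around the session lock (the 'user_id' session key is hypothetical; the relevant part is calling session_write_close() before the slow work, which sleep() simulates here):

<?php
// Read what is needed from the session, then release the file lock early
// so a second request from the same browser is not blocked behind this one.
session_start();
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
session_write_close();

// Long-running work happens after the lock is released.
sleep(5);
echo 'Done for user: ' . htmlspecialchars((string) $userId);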

Will B.

1 Answer


From the Apache 2.4 documentation, "Enable connection reuse to a FCGI backend like PHP-FPM":

Please keep in mind that PHP-FPM (at the time of writing, February 2018) uses a prefork model, namely each of its worker processes can handle one connection at the time. By default mod_proxy (configured with enablereuse=on) allows a connection pool of ThreadsPerChild connections to the backend for each httpd process when using a threaded mpm (like worker or event), so the following use cases should be taken into account:

Under HTTP/1.1 load it will likely cause the creation of up to MaxRequestWorkers connections to the FCGI backend.
Under HTTP/2 load, due to how mod_http2 is implemented, there are additional h2 worker threads that may force the creation of other backend connections. The overall count of connections in the pools may raise to more than MaxRequestWorkers.

The maximum number of PHP-FPM worker processes needs to be configured wisely, since there is the chance that they will all end up "busy" handling idle persistent connections, without any room for new ones to be established, and the end user experience will be a pile of HTTP request timeouts.
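Concretely, with the settings in the question (ThreadsPerChild 25, MaxRequestWorkers 100, four pools each capped at pm.max_children = 20), the worst case is far more persistent backend connections than PHP-FPM workers. Keeping reuse on would roughly require sizing each pool like the following sketch (illustrative only):

; Illustrative only: with enablereuse=on, keep at least one PHP-FPM worker
; per possible persistent connection, i.e. pm.max_children >= MaxRequestWorkers.
pm = static
pm.max_children = 100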

So my suggestion is to not use enablereuse with mod_proxy_fcgi + PHP-FPM.
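In practice that means dropping the reuse-enabled <Proxy> block and letting mod_proxy_fcgi open a fresh connection per request (a sketch based on the vhost config in the question):

<IfModule mod_proxy_fcgi.c>
    # No <Proxy ... enablereuse=on> block: each request gets its own
    # connection to PHP-FPM, which is the mod_proxy_fcgi default.
    <FilesMatch \.php$>
        <If "-f %{REQUEST_FILENAME}">
            SetHandler "proxy:unix:/var/run/example.sock|fcgi://127.0.0.1/"
        </If>
    </FilesMatch>
</IfModule>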

Ivan Gurzhiy
  • Thanks for the response. My understanding of the Apache concerns is that the server could run out of resources if `MaxRequestWorkers` and the PHP-FPM worker processes are not configured appropriately, thereby blocking new requests from being received, as the workers would all be "busy" with idle persistent connections. This should be resolvable by setting a low proxy timeout or adjusting the PHP-FPM `process_idle_timeout`, so `enablereuse` should be usable with an appropriate configuration, which is what I am looking to resolve. – Will B. Nov 09 '18 at 15:42
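For reference, a minimal sketch of the idle-connection tuning mentioned in the comment above (ttl is mod_proxy's lifetime for idle pooled connections; pm.process_idle_timeout only takes effect with pm = ondemand; all values are illustrative):

# Apache side: recycle idle reused connections quickly
<Proxy "fcgi://127.0.0.1" enablereuse=on max=10>
    ProxySet timeout=7200 ttl=10
</Proxy>

; PHP-FPM side (pm = ondemand): stop workers that sit idle
pm = ondemand
pm.max_children = 20
pm.process_idle_timeout = 10s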