
Related question: FastCGI and Apache 500 error intermittently

The solution suggested there does not work for me.


The problem:

I have a Laravel 5.1 application (it was previously in production on other servers without any problems) running on a fresh Ubuntu 14.04 server with Apache 2.4.7 and PHP served through PHP-FPM.

Everything works fine as long as a certain file isn't invoked in the application:

$compiledPath = __DIR__.'/cache/compiled.php';

if (file_exists($compiledPath)) {
    require $compiledPath; // this causes a "500 Internal Server Error"
}

It's a Laravel-specific file created automatically by the framework itself to speed things up a little, so it's not a bug in my code. The file really exists, I have full access permissions, and it's about 600 kB in size. When I remove it, everything works fine. But when I tell Laravel to create it again and then hit any route of the application, I get a "500 Internal Server Error" with the following log entries:

[fastcgi:error] [pid 14334] (104)Connection reset by peer: [client xxx.xxx.xxx.xxx:41395] FastCGI: comm with server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com" aborted: read failed

[fastcgi:error] [pid 14334] [client xxx.xxx.xxx.xxx:41395] FastCGI: incomplete headers (0 bytes) received from server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com"

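For reference, in a stock Laravel 5.1 project the file lives in bootstrap/cache/ and can be removed and recreated from the project root with the standard artisan commands (paths may differ if the bootstrap files were customised):

php artisan clear-compiled     # deletes bootstrap/cache/compiled.php
php artisan optimize --force   # recreates it, even with debug mode enabled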

What I've tried:

I tried the solution from the related question mentioned above, which also reflects most of the other suggestions I could find for this problem: play around with the common PHP-FPM settings in order to assign more resources. The accepted answer also mentions the option of abandoning FastCGI completely, but I don't want to go there. So I played around with the values, but had no luck.

There is no load on the server whatsoever since I'm the only one using it, so I really doubt it's an issue with the available resources (it's a VPS with 12 GB RAM). Could it have something to do with the file size? It's the only PHP file that big.

I could reproduce the problem on two different servers with the same configuration. It did not occur on an Ubuntu 12.04 server with Apache 2.2 and FastCGI.

My current configuration:

PHP-FPM:

pm.max_children = 10
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 5
pm.max_requests = 0

Apache (mod_fastcgi):

<IfModule mod_fastcgi.c>
    ...
    Alias /php5-fcgi /var/www/....
    FastCgiExternalServer /var/www/.... -idle-timeout 300 -socket /var/lib/php5-fpm/web1.sock -pass-header Authorization
</IfModule>

php.ini:

memory_limit = 512M
output_buffering = on
Quasdunk
  • Are you able to test the script just with `php-cli`? If the big file is deleted, will the app regenerate it, and does it fail on the new file as well? – Marki555 Jun 15 '15 at 11:41
  • @Marki555 Yes, I tried to delete and regenerate it many times, but in the end it only works if it's not there at all. But now that you've mentioned it - yes, it actually works fine through CLI! I guess this really is a big step towards the root of the problem. From my understanding this narrows down the problem to the php.ini file and the differences between the one for CLI and the 'normal' one, right? Or is there no FastCGI involved in calls from the CLI? (Sorry, I'm just a coder guy, I don't really know much about the inner workings of servers... :-P) – Quasdunk Jun 15 '15 at 11:57
  • FastCGI itself should have no effect. Do you have any accelerator installed, like APC or Xcache? Maybe it has issues with that file and PHP crashes because of it. – Marki555 Jun 15 '15 at 12:08
  • @Marki555 Yes, I had Xcache installed! Now it's gone and everything just works! You nailed it and made my day - thank you so much! Switched to APC now, no problems so far. Please post it as an answer so I can accept it! – Quasdunk Jun 15 '15 at 12:28

1 Answer


If PHP is failing only on specific source files, the most probable reason is that some PHP code accelerator (opcode cache) like Xcache, APC or eAccelerator has issues with the file. This can be due to bugs in the accelerator or in PHP itself.
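
When an accelerator crashes the worker, the FPM child dies before it can send a response, which is exactly what the "(104)Connection reset by peer" and "incomplete headers (0 bytes)" messages above indicate. A quick way to confirm this is to look for terminated children in the PHP-FPM log (the path below is the Ubuntu 14.04 default and may differ on other setups):

# look for FPM workers killed by a segfault or similar signal
grep -iE 'exited on signal|SIGSEGV' /var/log/php5-fpm.log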

You can try to run your script via the PHP command-line interface (the php-cli command), as the PHP CLI usually doesn't use any accelerators.
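
To illustrate the kind of checks meant here, a rough sketch (the commands assume a typical Ubuntu 14.04 / php5 setup and a Laravel project; package and path names may vary):

# from the Laravel project root: exercise the framework through the CLI
php artisan route:list

# compare which accelerator extensions are loaded by the CLI and enabled for the FPM pool
php -m | grep -iE 'xcache|apc|eaccelerator'
ls /etc/php5/fpm/conf.d/ | grep -iE 'xcache|apc|eaccelerator'

# if an accelerator is the culprit, remove or disable it and restart FPM, e.g. for XCache:
apt-get remove php5-xcache
service php5-fpm restart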

Marki555