
I have a PHP Laravel application installed on my CentOS VPS. It is the backend for my mobile application, which was recently updated. Unfortunately, I shipped some bad code that builds a request URL from a concatenation of hundreds, maybe thousands, of words.

My Apache server goes down very often (at least once per hour) and I have to run: service httpd restart.

In error.log I see a lot of entries like this:

(36)File name too long: Cannot map GET /adminpanel/public/api/v2/categoriese=c3Jr...    

There are tens of those requests per second, and Apache goes down.

I've published an update to the app stores, but some users still have the old version. Is there a way to block those requests before they are processed?

2 Answers


Apache's LimitRequestLine directive has a built-in default limit of 8190 bytes, which caps the length of the HTTP request line (method, URI, and protocol version). See whether altering this parameter does the job for you.
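The directive lives in the main server configuration (e.g. httpd.conf). A minimal sketch of the default setting:

```apache
# Maximum size, in bytes, of the HTTP request line (method + URI + protocol).
# 8190 is the compiled-in default; a request line exceeding it is rejected
# early with a 414 status, before any backend handler (e.g. PHP) runs.
LimitRequestLine 8190
```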

Regarding Apache going down every hour or more often: I would be surprised if this were related to rejecting overly long GET requests. Apache is carefully crafted to drop malformed requests with minimal effort, precisely to avoid denial of service and resource exhaustion.

From what you posted, this sounds like it may be related to the type of Multi-Processing Module (MPM) you are using, the ratio of minimum/maximum forked processes to available system resources, and, of course, the processing power your successful requests consume on the back-end (PHP).
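As an illustration of the knobs involved (the values below are hypothetical and must be sized against your RAM and the per-process memory footprint of PHP), a prefork MPM section looks like:

```apache
# Hypothetical prefork MPM tuning -- adjust to your RAM and the
# per-process footprint of mod_php. If MaxRequestWorkers is set higher
# than memory allows, the box starts swapping and Apache appears "down".
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers      150
    MaxConnectionsPerChild 1000
</IfModule>
```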

Miloš Đakonović

As a temporary workaround, you could set the LimitRequestLine parameter to a much larger value.
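Concretely, that would mean something like the following (the value is only an example; pick one comfortably above the longest URL your old clients generate):

```apache
# Example only: raise the request-line cap well above the 8190-byte
# default so the concatenated URLs from old app versions are accepted
# instead of rejected. Requires a restart (service httpd restart).
LimitRequestLine 65536
```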

Mark R.
  • I bumped the LimitRequestLine all the way up to 255989, but am still seeing these errors in the Apache log... The connecting IP is CloudFlare's and the protocol is HTTP/2.0. – Mikhail T. Feb 07 '22 at 19:26