
I'm seeing lots of "client denied by server configuration" errors in the log on one of my servers, e.g.,

[Sun Mar 11 14:47:27.600091 2018] [:error] [pid 15375] [client 146.52.126.142:55685] script '/home/example/www/wp-login.php' not found or unable to stat
[Sun Mar 11 14:49:05.022447 2018] [authz_core:error] [pid 13727] [client 137.226.113.26:55086] AH01630: client denied by server configuration: /home/example/www/
[Sun Mar 11 14:58:22.853323 2018] [authz_core:error] [pid 14437] [client 163.172.226.46:58423] AH01630: client denied by server configuration: /home/example/www/downloader
[Sun Mar 11 14:58:59.747029 2018] [authz_core:error] [pid 13770] [client 163.172.226.46:50464] AH01630: client denied by server configuration: /home/example/www/downloader
[Sun Mar 11 14:58:59.812363 2018] [authz_core:error] [pid 16432] [client 163.172.226.46:56776] AH01630: client denied by server configuration: /home/example/www/downloader
[Sun Mar 11 14:59:00.599941 2018] [authz_core:error] [pid 15228] [client 207.46.13.65:11653] AH01630: client denied by server configuration: /home/example/www/

Meanwhile, the access log shows that other clients were apparently served without a problem at effectively the same time (note the 200 response code alongside the 403s):

146.52.126.142 - - [11/Mar/2018:14:47:27 -0400] "GET /wp-login.php HTTP/1.1" 404 4062 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"
146.52.126.142 - - [11/Mar/2018:14:47:27 -0400] "GET / HTTP/1.1" 200 27838 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"
137.226.113.26 - - [11/Mar/2018:14:49:05 -0400] "GET / HTTP/1.1" 403 209 "-" "Mozilla/5.0 zgrab/0.x (compatible; Researchscan/t13rl; +http://researchscan.comsys.rwth-aachen.de)"
163.172.226.46 - - [11/Mar/2018:14:58:22 -0400] "GET /downloader/ HTTP/1.1" 403 220 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0"
163.172.226.46 - - [11/Mar/2018:14:58:59 -0400] "GET /downloader/ HTTP/1.1" 403 220 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0"
163.172.226.46 - - [11/Mar/2018:14:58:59 -0400] "GET /downloader/ HTTP/1.1" 403 220 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0"
207.46.13.65 - - [11/Mar/2018:14:59:00 -0400] "GET / HTTP/1.1" 403 209 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"

(The ones that really concern me are the search engine crawlers, e.g., Bingbot and the IP addresses belonging to Google: if the search engines are having a hard time fetching pages from my site, I've got a big problem.)

I've got no idea why the "client denied" errors are occurring: I never have any problem browsing the site myself, and I can't reproduce the failure.

I tried adding a <Location> directive as suggested in the accepted answer to "authz_core keeps denying access" (that's the <Location "/"> block you can see in the configuration below), but (as I expected) it didn't make any difference: as far as I can tell, the server configuration is correct.

Here is the configuration for the server:

# redirect all HTTP (port 80) requests to the HTTPS server
<VirtualHost *:80>
    ServerName secure.example.com
    Redirect permanent / https://secure.example.com/
</VirtualHost>
<VirtualHost *:80>
    ServerName www.example.com
    Redirect permanent / https://www.example.com/
</VirtualHost>
<VirtualHost *:80>
    ServerName example.com
    ServerAlias *.example.com
    Redirect permanent / https://example.com/
</VirtualHost>

# HTTPS server configuration
<VirtualHost *:443>
    ServerName secure.example.com
    ServerAlias example.com www.example.com
    ServerAdmin webmaster@example.com
    SSLEngine on
    SSLCipherSuite ALL:!ADH:!EXPORT56:RC4+RSA:+HIGH:+MEDIUM:+LOW:+SSLv2:+EXP:+eNULL
    SSLCertificateFile /path/to/example.com/fullchain.pem
    SSLCertificateKeyFile /path/to/live/example.com/privkey.pem
    DocumentRoot "/home/example/www"
    <Directory "/home/example/www">
        Options All MultiViews
        AllowOverride All
        Require all granted
    </Directory>
    <Location "/">
        Require all granted
    </Location>
    LogLevel debug
    ErrorDocument 404 /cgibin/badurl.php
    ErrorLog "|/usr/local/sbin/rotatelogs /home/example/logs/error_log 86400"
    CustomLog "|/usr/local/sbin/rotatelogs /home/example/logs/access_log 86400" combined
    CustomLog "|/usr/local/sbin/rotatelogs /home/example/logs/ssl_request_log 86400" ssl_request
    BrowserMatch "MSIE [2-5]" ssl-unclean-shutdown nokeepalive downgrade-1.0 force-response-1.0
    BrowserMatch "MSIE [6-9]" ssl-unclean-shutdown
    <Files ~ "\.(cgi|shtml|phtml|php4|php|pl)$">
        SSLOptions +StdEnvVars
    </Files>
    <Directory "/home/example/cgibin">
        SSLOptions +StdEnvVars
    </Directory>
    <IfModule mod_alias.c>
        ScriptAlias /cgibin/ "/home/example/cgibin/"
        <Directory "/home/example/cgibin">
            AllowOverride None
            Options FollowSymlinks
            Require all granted
        </Directory>
    </IfModule>
</VirtualHost>

There are no .htaccess files in the directory, so I don't know what's causing this problem. Even with LogLevel set to debug, all I get in the error log is the "client denied" message.
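
If it would help to rule out .htaccess processing entirely, my understanding is that I could change the <Directory> block to something like this (untested; just my reading of how AllowOverride works):

<Directory "/home/example/www">
    Options All MultiViews
    # Assumption: with AllowOverride None, Apache never even looks for
    # .htaccess files, so any stray per-directory overrides are ruled out.
    AllowOverride None
    Require all granted
</Directory>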

Are these errors from malicious clients attempting to use my server as a proxy? If so, how do I get them to stop pounding on it? Or is there something else going wrong?
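
If simply blocking them is the answer, my understanding is that I could deny the individual offenders with something like the following (addresses copied from the logs above; I haven't actually tried this), but I'd rather understand the root cause first:

<Directory "/home/example/www">
    <RequireAll>
        # Keep serving everyone else...
        Require all granted
        # ...but refuse the clients that keep hammering the server
        # (example addresses taken from the error log above)
        Require not ip 163.172.226.46
        Require not ip 137.226.113.26
    </RequireAll>
</Directory>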

Is there any way to get more debugging information written to the log so I can understand why the failures are happening?
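
For what it's worth, I've read that Apache 2.4 supports per-module log levels, so I'm guessing something like this might produce more detail from the authorization code, though I don't know whether it will actually explain the denials:

# Assumption: raising just mod_authz_core to a trace level should show
# which Require directive matched (or failed) for each request, without
# flooding the log with debug output from every other module.
LogLevel info authz_core:trace8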

Server version: Apache/2.4.29 (FreeBSD)
Server built:   unknown
FreeBSD Dreamer 11.1-RELEASE-p4 FreeBSD 11.1-RELEASE-p4 #0: Tue Nov 14 06:12:40 UTC 2017     root@amd64-builder.daemonology.net:/usr/obj/usr/src/sys/GENERIC  amd64
