mitchellkrogza / apache-ultimate-bad-bot-blocker

Apache Block Bad Bots, (Referer) Spam Referrer Blocker, Vulnerability Scanners, Malware, Adware, Ransomware, Malicious Sites, WordPress Theme Detectors and Fail2Ban Jail for Repeat Offenders

[BUG] "Double" 403 errors #142

Open s22-tech opened 4 years ago

s22-tech commented 4 years ago

Describe the bug: Every time the Apache 2.4 blocker is activated, two 403s are generated in the error log.

To Reproduce: I can reproduce it myself by using the test commands, e.g.:

curl --head https://www.domain.com --referer semalt.com

Expected behavior: I would expect to get only one 403 per blocked request, but one is also generated for the 403.shtml file.

Additional information: The 403.shtml file exists and is reachable, so I don't think this error should appear in the logs: "client denied by server configuration: /home/user/public_html/403.shtml"
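
That log line is characteristic of Apache's ErrorDocument handling: when a request is denied, Apache makes an internal subrequest for the error page, and that subrequest is checked against the same access rules, so the blacklist denies the 403 page itself a second time. One way to exempt the error page is a later section that re-grants access (a minimal sketch; the /403.shtml path is illustrative, and the block must come after the blacklist's <Location "/">):

<Location "/403.shtml">
    # With AuthMerging at its default of Off, this later section's
    # Require replaces the blacklist's deny for this path only.
    Require all granted
</Location>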

s22-tech commented 4 years ago

I found either a solution or a workaround; I'm not sure which. I added the following block:

<LocationMatch ^(/php-fpm)?/errors/>
    # Allow everyone to reach the error-document directory.
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

just after:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
    # AND - combine with preceding configuration sections.
    AuthMerging And
    # Include black list.
    Include custom.d/globalblacklist.conf
</Location>

and put my error files in that directory. That stopped the double 403 entries in the log.

I'm still not sure why this happened in the first place, but at least my error_log is half the size it was before.
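
The likely mechanism, going by Apache 2.4's section-merging rules rather than anything stated in this thread: <Location> and <LocationMatch> sections merge in the order they appear in the config, and with AuthMerging at its default of Off the later section's Require all granted replaces the blacklist's deny for URLs under /errors/, so the internal subrequest for the error page is no longer refused. A quick check (domain and referer are illustrative):

# The blocked request should still log exactly one 403...
curl --head https://www.domain.com --referer semalt.com
# ...while the error page itself should now return 200.
curl --head https://www.domain.com/errors/403.shtml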

Great script, by the way! I just hope I can solve this problem correctly.

s22-tech commented 4 years ago

Well, there's another problem. robots.txt is also blocked by this and, of course, it shouldn't be. I'm beginning to think I may have installed this incorrectly.

Is there a way to allow pages like robots.txt and 403.shtml for everyone? That way, bots that obey the robots.txt directives wouldn't keep hitting the site and clogging up the error_log.

Thanks.
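
The same later-section override sketched above for /403.shtml should cover this (a sketch; it assumes robots.txt is served from the document root, and the block must be placed after the blacklist include):

<Location "/robots.txt">
    # Re-grant access so every crawler can read the robots rules.
    Require all granted
</Location>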

mitchellkrogza commented 4 years ago

Must be something else wrong with your setup. The blocker won't block robots.txt. I run this on both Apache and Nginx servers; the robots.txt handles the initial instruction to a bot, and those that disobey robots.txt thereafter get caught by the blocker. The double log entries also have nothing to do with the blocker; that would simply be an error or duplication in your Apache config.
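
One detail worth checking, as an aside about the test method rather than anything stated here: the test commands send a blacklisted referer, and the global <Location "/"> applies that check to every path, so a referer test against robots.txt returns 403 even though a normal crawler request would get through. A plain request is the right baseline (domain illustrative):

# Should return 200 if robots.txt is genuinely unblocked.
curl --head https://www.domain.com/robots.txt
# May legitimately return 403: the bad referer trips the blacklist
# regardless of the path requested.
curl --head https://www.domain.com/robots.txt --referer semalt.com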

s22-tech commented 4 years ago

I added the following section to Apache's Post VirtualHost Include, since we were told not to mess with httpd.conf directly:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
    # AND-combine with preceding configuration sections
    AuthMerging And
    # include black list
    Include custom.d/globalblacklist.conf
</Location>

Is that the correct file to add this to? Is there something else that needs to be done as well?

Thanks.

mitchellkrogza commented 4 years ago

Need to see the httpd.conf and vhost configs to be able to see what's going on.

s22-tech commented 4 years ago

Sorry, not sure what you mean by "vhost config". Are those the extra files that cPanel adds to httpd.conf, like post_virtualhost_global.conf?

Also, do you want me to post them here or send them via email?

s22-tech commented 4 years ago

In the interest of saving time, I'll post what I have here. If it's not what you're asking for, let me know.

There's only one "extra" conf file that's populated - post_virtualhost_global.conf. There are no files in the listed IncludeOptional paths.

IncludeOptional /usr/local/apache/conf/sharedssl/*.conf
IncludeOptional /usr/local/apache/conf/sharedurl/*.conf

# Drop the Range header when more than 5 ranges.
# CVE-2011-3192
SetEnvIf Range (,.*?){5,} bad-range=1
RequestHeader unset Range env=bad-range

# Optional logging.
CustomLog logs/range-CVE-2011-3192.log common env=bad-range

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging

<Location "/">
    # AND - combine with preceding configuration sections.
    AuthMerging And
    # Include black list.
    Include custom.d/globalblacklist.conf
</Location>

<LocationMatch ^(/php-fpm)?/errors/>
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

# Global robots.txt file for controlling crawlers.
<LocationMatch ^(/php-fpm)?/robots\.txt>
    Require         all granted
    # Exclude robots.txt from any ProxyPass (e.g. to PHP-FPM).
    ProxyPass       !
</LocationMatch>
#Alias /robots.txt /var/www/html/robots.txt
Alias /robots.txt /home/username/public_html/errors/robots_custom.txt

s22-tech commented 4 years ago

I also see that user agents aren't being blocked. Have you had a chance to see what's wrong with this cPanel setup?
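
A quick way to verify user-agent blocking (domain illustrative; substitute an agent string that actually appears in custom.d/globalblacklist.conf, since the exact names vary by release):

# Should return 403 Forbidden if the blocker is matching user agents.
curl --head https://www.domain.com -A "SomeBlockedBot"
# An ordinary browser agent should still return 200.
curl --head https://www.domain.com -A "Mozilla/5.0"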