mitchellkrogza / apache-ultimate-bad-bot-blocker

Apache Block Bad Bots, (Referer) Spam Referrer Blocker, Vulnerability Scanners, Malware, Adware, Ransomware, Malicious Sites, Wordpress Theme Detectors and Fail2Ban Jail for Repeat Offenders

Bot blocker not working for Apache/2.4.29 (Ubuntu) #139

Open contactr2m opened 4 years ago

contactr2m commented 4 years ago

Server version: Apache/2.4.29 (Ubuntu). I followed your steps for Apache 2.4, but when I test my site with curl, the bot blocker does not seem to be working.

Curl shows a 301, maybe because I have an HTTP-to-HTTPS redirect?

curl -A "80legs" http://example.com
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://example.com/">here</a>.</p>
<hr>
<address>Apache/2.4.29 (Ubuntu) Server at example.com Port 80</address>

But if I use curl -A "80legs" https://example.com, the entire page loads instead of returning a 403?

/etc/apache2# cat sites-enabled/000-default.conf

UseCanonicalName On

<VirtualHost *:80>
        ServerAdmin webmaster@localhost

        ServerName example.com
        ServerAlias www.example.com

        DocumentRoot /var/www/html

        <Directory /var/www/html/>
            Options FollowSymLinks
            AllowOverride All
            # Include /etc/apache2/custom.d/globalblacklist.conf
            Include custom.d/globalblacklist.conf
            Require all denied
        </Directory>

        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined
RewriteEngine on
RewriteCond %{SERVER_NAME} =example.com [OR]
RewriteCond %{SERVER_NAME} =www.example.com
RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [END,NE,R=permanent]
</VirtualHost>
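For comparison, the main Directory block in the repo's tested vhost (testsite.conf) takes roughly the shape below. This is a sketch assumed from the Apache 2.4 setup docs, not a copy of that file; check the exact directives against the repository. The notable difference from the config above is that access is granted here, and the Include supplies the deny rules:

```apache
# Sketch only: pattern assumed from the repo's Apache 2.4 instructions.
# There is no blanket "Require all denied" in this block; the included
# globalblacklist.conf is what denies bad bots and referrers.
<Directory /var/www/html/>
    Options FollowSymLinks
    AllowOverride All
    Require all granted
    Include custom.d/globalblacklist.conf
</Directory>
```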

sites-enabled/000-default-le-ssl.conf

<IfModule mod_ssl.c>
<VirtualHost *:443>
        ServerAdmin webmaster@localhost

        ServerName example.com
        ServerAlias www.example.com

        DocumentRoot /var/www/html

        <Directory /var/www/html/>
            Options FollowSymLinks
            AllowOverride All
            # Include /etc/apache2/custom.d/globalblacklist.conf
            Include custom.d/globalblacklist.conf
            Require all denied
        </Directory>

        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined

Include /etc/letsencrypt/options-ssl-apache.conf
SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>
</IfModule>
ZerooCool commented 4 years ago

I have the same, with: Require all denied

If I add Include custom.d/globalblacklist.conf at the beginning or at the end of the <Directory /var/www/server-ip> block that contains Require all denied, then my 403 doesn't work and I can still see index.html.

My Vhost : https://wiki.visionduweb.fr/index.php?title=VirtualHosts_des_domaines_enregistr%C3%A9s#139.99.173.195_.C3.A9coute_du_port_HTTP_80

cafn commented 4 years ago

No response yet. I too am having the same problems. I just installed (actually updated to the newest version as of this date), and I cannot get the test responses to reply as they are supposed to. My responses:

curl -A "googlebot" https://mydomain.com should respond with 200 OK, but instead I get no response on http://, and https:// loads the page.

curl -A "80legs" https://mydomain.com and curl -A "masscan" https://mydomain.com should respond with 403 Forbidden, but instead I get no response on http://, and https:// loads the page.

curl -I https://mydomain.com -e http://100dollars-seo.com and curl -I https://mydomain.com -e http://zx6.ru should respond with 403 Forbidden, but I get "HTTP/2 200" back on both http:// and https://. The connection is made.

Granted, the URL is forwarded to https://, so I did not expect a reply on http://, but I did expect the documented results when using https://. That did not happen. I installed the previous version about 18 months ago and it worked fine when installed, but I had not checked it since. I just re-downloaded the new files per the instructions and left the vhost include statement as it was. This does not appear to be working for me, and I would really like it to work. I have read the instructions a couple more times and made sure all files were in place.

Anybody got any ideas?

ZerooCool commented 4 years ago

In my logs I see the same for AspiegelBot. It should respond with 403 Forbidden (or a 302 redirect?), but I get "200".

curl -A "80legs" https://www.visionduweb.fr
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>302 Found</title>
</head><body>
<h1>Found</h1>
<p>The document has moved <a href="https://www.visionduweb.fr/403-forbidden.php">here</a>.</p>
</body></html>

curl -I https://www.visionduweb.fr -e http://100dollars-seo.com
HTTP/1.1 302 Found
Date: Tue, 05 May 2020 17:04:17 GMT
Server: Apache
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
X-Frame-Options: SAMEORIGIN
Referrer-Policy: no-referrer-when-downgrade
Feature-Policy: geolocation none;midi none;notifications none;push none;sync-xhr self;microphone none;camera none;magnetometer none;gyroscope none;speaker self;vibrate none;fullscreen self;payment none;
Location: https://www.visionduweb.fr/403-forbidden.php
Cache-Control: max-age=604800
Expires: Tue, 12 May 2020 17:04:17 GMT
Content-Type: text/html; charset=iso-8859-1

curl -A "AspiegelBot" https://www.visionduweb.fr
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>302 Found</title>
</head><body>
<h1>Found</h1>
<p>The document has moved <a href="https://www.visionduweb.fr/403-forbidden.php">here</a>.</p>
</body></html>

curl -A "AspiegelBot" -L https://www.visionduweb.fr
<!doctype html>
<html lang="fr">
<head>
<meta charset="utf-8"/>
</head>
<body>
<h1 style="text-align:center;">403 - Forbidden</h1>
<h2 style="text-align:center;">L'administrateur du serveur n'autorise pas la consultation de cette page.</h2>
<p style="text-align:center;"><img src="/images/structure/403-forbidden.jpg"/></p>
<p>Si vous pensez qu'il s'agit d'une erreur, merci de contacter l'administrateur en lui signifiant l'adresse URL de provenance.</p>
<p>Merci de transmettre des informations sur cette erreur, ainsi que la date et l'heure.</p>
<p>Le mail de l'administrateur : mail@visionduweb.com</p>
<p><a href="https://www.visionduweb.fr">Retour sur la page d'accueil</a></p>
</body>
</html>

I have installed Bad Bot Blocker with this method : https://wiki.visionduweb.fr/index.php?title=Configurer_le_fichier_.htaccess#Bloquer_des_Bots_et_des_URL_ind.C3.A9sirables_avec_Bad_Bot_Blocker

My VirtualHost : https://wiki.visionduweb.fr/index.php?title=VirtualHosts_des_domaines_enregistr%C3%A9s

ctlui commented 4 years ago

I was getting the same as you, so I tried the v2.2 instructions and it works now. About "AspiegelBot": it is not in globalblacklist.conf like "zx6.ru" is.

ZerooCool commented 4 years ago

I have added AspiegelBot to my conf ;)

Where are the v2.2 instructions? But we have Apache 2.4.

For my last test, for AspiegelBot, I used the -L option for curl, and then I could do the same with 80legs. Strange: if I use curl -A .. -L .., the redirection works fine.

The curl option -L is the right answer for you and me.

curl -A "80legs" -L https://www.visionduweb.fr
<!doctype html>
<html lang="fr">
<head>
<meta charset="utf-8"/>
</head>
<body>

<h1 style="text-align:center;">403 - Forbidden</h1>
<h2 style="text-align:center;">L'administrateur du serveur n'autorise pas la consultation de cette page.</h2>

<p style="text-align:center;"><img src="/images/structure/403-forbidden.jpg"/></p>

<p>Si vous pensez qu'il s'agit d'une erreur, merci de contacter l'administrateur en lui signifiant l'adresse URL de provenance.</p>
<p>Merci de transmettre des informations sur cette erreur, ainsi que la date et l'heure.</p>
<p>Le mail de l'administrateur : mail@visionduweb.com</p>

<p><a href="https://www.visionduweb.fr">Retour sur la page d'accueil</a></p>
</body>
</html>
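A side note on the 302 responses above: this behaviour matches Apache's ErrorDocument directive being given a full URL. In that case Apache cannot serve the error page internally, so it answers 302 Found with a Location header, and curl only reaches the 403 page when told to follow redirects with -L. A sketch, assuming such a directive exists in the site's config:

```apache
# A full URL forces Apache to answer 302 Found (redirect) instead of
# serving the 403 error page inline with the original status code:
ErrorDocument 403 https://www.visionduweb.fr/403-forbidden.php

# A local path keeps the real 403 status in the response:
# ErrorDocument 403 /403-forbidden.php
```

So with a full-URL ErrorDocument, a blocked request still shows the blocker working; the 403 is just hidden behind one redirect hop.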
dchmelik commented 4 years ago

This is not just an Ubuntu problem but a general Apache 2.4 problem, at least on other POSIX-based OSs (such as stable/up-to-date Slackware GNU/Linux 14.2 with Apache 2.4.43): the setup instructions' tests for bad bots result in Apache serving pages fine (with a normal 200 code).

mitchellkrogza commented 4 years ago

The blocker does work and is tested across 2.2 up to 2.4.23 - see the build log and tests for yourself https://travis-ci.org/github/mitchellkrogza/apache-ultimate-bad-bot-blocker

If you mess up your Apache 2.4 permission structure in any way higher up the chain, you break everything below it, including the blocker.

dchmelik commented 4 years ago

The blocker does work and is tested across 2.2 up to 2.4.23 - see the build log and tests for yourself [...]

Fine, but I reported the same problem on more server-focused OSes: are there any such setup/test logs for those?

If you mess up your Apache 2.4 permission structure [...]

It's unclear to me what that means for Apache... can anyone suggest documentation or elaborate on how to debug this?

mitchellkrogza commented 4 years ago

Start off by comparing your apache2.conf with the version used in tests - https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/apache2.conf

Specifically these blocks

https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/apache2.conf#L159-L180
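For reference, the stock Debian/Ubuntu apache2.conf ships directory permission blocks roughly like the ones below; if a stricter deny is introduced here, or the granting blocks are removed, every vhost underneath inherits the breakage. This is a sketch of the usual defaults, not a copy of the linked file:

```apache
# Deny everything at the filesystem root by default...
<Directory />
    Options FollowSymLinks
    AllowOverride None
    Require all denied
</Directory>

# ...then explicitly grant access to the web root.
<Directory /var/www/>
    Options Indexes FollowSymLinks
    AllowOverride None
    Require all granted
</Directory>
```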

mitchellkrogza commented 4 years ago

Then also make sure your vhost config follows the same configuration for its main directory block https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/testsite.conf

skonesam commented 2 years ago

The links above don't work, but I found the tests anyway. The only (significant?) change I see is that RewriteEngine On is in the test but not in the instructions. It doesn't seem to change the outcome.

amitash commented 1 year ago

I had the same issue, but I was whitelisting my own IP, so everything was allowed. Testing from another machine worked well.
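For anyone hitting the same thing: the blocker's setup includes a whitelist file for your own IPs, and requests from a whitelisted IP bypass the bad-bot rules entirely, so curl tests run from that machine can never produce a 403. The filename and format below are assumptions based on the repo's instructions; check your own custom.d directory:

```apache
# custom.d/whitelist-ips.conf (203.0.113.10 is a placeholder address)
# Any request from this IP is always allowed, which makes curl tests
# from the whitelisted machine appear as if the blocker were broken.
Require ip 203.0.113.10
```

Testing from a second, non-whitelisted host (or a mobile connection) is the simplest way to rule this out.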