mitchellkrogza / nginx-ultimate-bad-bot-blocker

Nginx Block Bad Bots, Spam Referrer Blocker, Vulnerability Scanners, User-Agents, Malware, Adware, Ransomware, Malicious Sites, with anti-DDOS, Wordpress Theme Detector Blocking and Fail2Ban Jail for Repeat Offenders

blacklist-user-agents.conf not working, or bad config? #49

Closed shainegordon closed 7 years ago

shainegordon commented 7 years ago

Firstly, this is an awesome repository, thank you very much

I am trying to completely block Baidu/Baiduspider from my site, however, this doesn't seem to work. I am sure I am doing something wrong, but it is not obvious to me what.

/etc/nginx/bots.d/blacklist-user-agents.conf

"~*Baidu"               3;
"~*Baiduspider"               3;

After restarting nginx:

# curl -A "AhrefsBot" https://www.mydomain.com
curl: (52) Empty reply from server
# curl -A "Baiduspider" https://www.mydomain.com
<!doctype html>
<html>
//snipped

Looking at my access logs, I can see:

180.76.15.154 - - [10/May/2017:11:39:56 +0200] "GET /product/9781441101471 HTTP/1.1" 200 4060 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
mitchellkrogza commented 7 years ago

Hi @shainegordon, thanks for your message. I am indeed working on a different way for people to override the bots they don't want. Strangely, it works perfectly in my Apache bad bot blocker but not in Nginx; I think I just need to re-order where the blacklist-user-agents.conf file is included.

Unfortunately, I lost my father on Friday morning after a five-month battle with cancer, so I'm not doing any updates right now while I take care of the arrangements. But if you can bear with me, I assure you that as soon as I get my head straight I will figure out a solution for you. Kind Regards, Mitchell

shainegordon commented 7 years ago

My condolences. In the meantime, I have directly updated "globalblacklist.conf" and set the existing "Baidu" rule to "3".

Our update process is manual, so I have made sure our bash script prints a bold reminder to re-apply this value after each update.
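A hypothetical sketch of such a reminder (the actual script is not shown in the thread; the message text and use of `tput` are assumptions):

```shell
#!/usr/bin/env bash
# After pulling a fresh globalblacklist.conf, print a bold reminder
# that the local "Baidu" override must be re-applied by hand.
BOLD=$(tput bold 2>/dev/null || printf '\033[1m')
RESET=$(tput sgr0 2>/dev/null || printf '\033[0m')

printf '%sIMPORTANT: re-set the "Baidu" rule to 3 in globalblacklist.conf%s\n' \
    "$BOLD" "$RESET"
```

Bolding the message is just a way to make a manual step hard to miss; the `tput` fallback keeps the script working on terminals without capability support.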

mitchellkrogza commented 7 years ago

Thanks @shainegordon, tough times, but that is life. I am sure what I have planned will sort out the override abilities for users, letting them use the auto-update feature without ever having to modify the globalblacklist.conf file by hand. For now you have it sorted, and hopefully by next week I will push out a new update that fixes it for good.

So glad you are enjoying what the blocker can do for your sites. Over the past year I have seen amazing growth in my own sites, not only in traffic but also in better SEO results on Google, and the monthly AdSense revenue I run on a few sites has doubled and continues to grow, which shows just how much damage all this spam and bad referrer traffic actually causes.

mitchellkrogza commented 7 years ago

@shainegordon fixed in https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker/commit/8e26c5f2d4ab8268f670e7e1747b062bb593f8d3

Simply changed the position of the include for blacklist-user-agents.conf so that it now loads before all the whitelisted user agents and therefore takes precedence. Please download the latest globalblacklist.conf, add the bots you want blocked (like Baidu) to your blacklist-user-agents.conf file, and reload nginx; you will see they are now blocked. A simple fix, as I hoped it would be :) Enjoy 👍
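A sketch of the ordering the fix describes (the surrounding `map` directive and paths are illustrative, not copied from the linked commit). In an nginx `map`, when several regular expressions match, the first one listed wins, so entries included earlier take precedence:

```nginx
map $http_user_agent $bad_bot {
    default 0;
    # User overrides now load FIRST, so a "~*Baidu" 3; entry here
    # beats any bundled whitelist entry for the same agent below.
    include /etc/nginx/bots.d/blacklist-user-agents.conf;
    # ... bundled whitelisted and blacklisted user agents follow ...
}
```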

mitchellkrogza commented 7 years ago

@shainegordon have you tried the latest update? Please let me know if it now resolves your issue.

shainegordon commented 7 years ago

thanks @mitchellkrogza, this works perfectly