C0nw0nk closed this issue 7 years ago
Short solution: first try to limit the request rate by IP; you can do that with ngx_http_limit_req_module:
That was already done and is in place, but there are many IPs, possibly thousands, probing Shellshock-style. They are not probing for SQL exploits or anything to hack the server with; they are just intentionally bypassing the caches to DoS the server down (Slowloris-like).
```nginx
http {
    limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
    limit_conn_zone $binary_remote_addr zone=addr:10m;

    server {
        location / {
            limit_req zone=one burst=5;
            limit_conn addr 1;
        }
    }
}
```
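One small addition to the above, a sketch assuming nginx 1.3.15 or newer: both modules answer with 503 by default, which can be confused with a genuine backend outage in monitoring; 429 ("Too Many Requests") is more accurate.

```nginx
server {
    location / {
        limit_req zone=one burst=5;
        limit_conn addr 1;
        limit_req_status 429;    # instead of the default 503
        limit_conn_status 429;
    }
}
```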
That is why some naxsi rules to prevent these fake and junk-data requests would be useful.
As you can see from the User-Agent I provided, it is obviously spoofed; Naxsi could use a rule to detect spoofed/fake user agents that are too short, all numbers, etc.
It also appears that browser extensions like this do not help the situation either:
https://addons.mozilla.org/en-GB/firefox/addon/random-agent-spoofer/?src=cb-dl-users
They insert fake data into URLs, spoof headers, etc.
I am not seeking to block spoofed headers that could be real, valid headers or URLs, but when I receive requests that are blatantly obvious garbage data and spoofed, these should be stopped.
A blatantly problematic request with the intention of bypassing caches:
User-Agent : 12345
URL : /inDeX.php?Random=FAKEData&vars=123&vars2=456&etc=more-fake_garbage
You can write a rule that matches on User-Agent, I guess; something like MainRule "rx:[0-9\"]" "mz:$HEADERS_VAR:user-agent" "s:$DOS:4" "id:1337"; (with a matching CheckRule for the $DOS score).
Thanks, but wouldn't a rule like that block any user-agent that contains numbers? For example, a valid, legitimate user-agent is as follows:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
It needs to be a strict match for numbers only, or exact matching, I feel.
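A stricter variant along those lines, as a sketch (it assumes naxsi's MainRule/CheckRule syntax; the id and the $DOS score name are arbitrary choices, not anything the project ships): anchoring the regex means it only fires when the whole User-Agent is numeric, so "Googlebot/2.1" would not match.

```
MainRule "rx:^[0-9]+$" "msg:all-numeric User-Agent" "mz:$HEADERS_VAR:user-agent" "s:$DOS:8" "id:42000401";
CheckRule "$DOS >= 8" BLOCK;
```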
As for query strings, that's just a nightmare really:
?rnd
?ran
?rand
?random
?1
?2
?3
?4
?abc
?efg
etc.
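Rather than enumerating junk parameter names, one alternative is to attack it from the caching side instead of naxsi (a sketch; $arg_id and $arg_page are placeholders for whatever parameters the application really uses): build the cache key only from known parameters, so random cache-busters all map to the same cached object.

```nginx
# Unknown query arguments no longer change the cache key, so
# ?Random=FAKEData&vars=123 hits the same cache entry as no query at all.
# Note $uri is case-sensitive, so mixed-case paths still need separate handling.
proxy_cache_key "$scheme$host$uri$arg_id$arg_page";
```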
Hello, I think having a challenge (i.e. JS) will be more efficient than trying to solve it with naxsi, as you might run into an endless cat-and-mouse game :) Something like https://github.com/kyprizel/testcookie-nginx-module might help!
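A minimal sketch of that module's setup, with directive names taken from its README (verify them against the version you actually build, and replace the secret and backend with your own):

```nginx
location / {
    testcookie on;                      # challenge every new client
    testcookie_name BPC;                # cookie set after passing the JS challenge
    testcookie_secret keepmesecret;     # change this
    testcookie_max_attempts 3;          # clients that keep failing get the fallback
    testcookie_fallback /blocked.html;
    proxy_pass http://backend;          # placeholder upstream
}
```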
It is already a cat and mouse game :(
Do you know if javascript challenge pages like this (the same as Cloudflare's Anti-DDoS / IUAM "I'm Under Attack Mode" javascript challenge page, coincidentally) can allow search engine crawlers through? I don't know how well Google, Bing, Baidu, DuckDuckGo, etc. handle them; these search engine crawlers and bots may not understand or solve javascript challenges.
You can of course whitelist the search engines, since their IP ranges are public (or whitelist by user-agent, but it won't take long for the attacker to detect this, I guess).
Yeah, I would never whitelist the user-agent, since it is easy for them to spoof and fake that.
As for the IPs, I have no idea if there is an existing list anyone can share or is aware of that covers all legitimate search engine crawler IPs to whitelist.
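Rather than a static IP list, the major engines document reverse-DNS verification for their crawlers: PTR-resolve the client IP, check the resulting hostname's domain, then forward-resolve it back to confirm. A minimal Python sketch (the domain list here is illustrative, not complete; check each engine's own documentation):

```python
import socket

# Domains legitimate crawlers reverse-resolve to, per each engine's docs.
# Illustrative subset only.
CRAWLER_DOMAINS = ("googlebot.com", "google.com", "search.msn.com")

def host_matches(hostname, domains=CRAWLER_DOMAINS):
    """True if hostname is, or is a subdomain of, an allowed domain."""
    hostname = hostname.rstrip(".").lower()
    return any(hostname == d or hostname.endswith("." + d) for d in domains)

def verify_crawler_ip(ip):
    """PTR-resolve the IP, check the domain, then forward-confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse (PTR) lookup
        if not host_matches(hostname):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the PTR record could be attacker-controlled.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

The result could then feed an nginx whitelist (e.g. a geo map regenerated periodically) so verified crawlers skip the JS challenge.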
So I have received a lot of annoying requests to my servers lately that NAXSI could definitely solve.
The URLs contain strings like this to bypass caches and flood / DoS back-end processes:
(switching between upper and lower case)
(inserting random junk data to bypass caches)
User agent used:
Other user agents used; they increment the number by +1 each time, etc.
What are some good rules to prevent this?