Is this an Addition / Removal Request?

Addition. Please and thank you!
Please List the User-Agent string or Referrer to be added/removed

example: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/webkit-version (KHTML, like Gecko) Silk/browser-version like Chrome/chrome-version Safari/webkit-version

Mozilla/5.0 (compatible; Seekport Crawler; http://seekport.com/)
Please explain why it should be added/removed

The bot / user agent doesn't respect the current robots.txt, .htaccess, or even the Apache bad bot settings whatsoever, and winds up toppling servers with an insane number of constant requests. We had to block the offending IP at the firewall level.
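To illustrate how a flood like this shows up before resorting to a firewall block, here is a minimal sketch that tallies requests per client IP in an access log. The log lines and the IP addresses below are fabricated placeholders (the real offending IP and institution domain are not reproduced here), and the common log format is an assumption:

```shell
# Fabricated sample log in common log format; 203.0.113.10 stands in
# for the offending crawler's real address.
cat > /tmp/sample_access.log <<'EOF'
203.0.113.10 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 612
203.0.113.10 - - [10/Oct/2023:13:55:37 +0000] "GET /page HTTP/1.1" 200 612
198.51.100.7 - - [10/Oct/2023:13:55:38 +0000] "GET / HTTP/1.1" 200 612
EOF

# Tally requests per client IP, highest count first; the top line is
# the flood source.
awk '{print $1}' /tmp/sample_access.log | sort | uniq -c | sort -rn | head -1

# The top offender can then be dropped at the firewall level, roughly:
#   iptables -I INPUT -s 203.0.113.10 -j DROP
# (requires root; shown as a comment only)
```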
For Additions: Please include a log sample (3-5 lines is adequate)
I've sanitized the institution domain, but you get the drift. The IP is real. We're running Apache in a Docker container, which works really well. Big fans of your software.

The requests first come through a Traefik Docker container (reverse proxy), which forwards them to the Apache container running the Bad Bot blocker. Typically, blocking by user-agent does the trick right away, but apparently not this time.

Any other important information to consider
Despite adding the user-agent above to blacklist-user-agents.conf, about 3/4 of the requests were still getting through. Oddly, some requests to the site homepage were blocked, while the more elaborate requests pushed through.
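For completeness, here is a minimal sketch of the kind of user-agent block attempted, written with stock Apache mod_setenvif and mod_authz_core directives. The environment variable name `bad_bot` and the `<Location>` wrapper are illustrative assumptions; the real blacklist-user-agents.conf entry should mirror the regex style of the file's existing lines:

```apache
# Assumption: a plain substring match on the crawler token; the blocker's
# own file may expect a different regex form -- copy an existing entry's style.
BrowserMatchNoCase "Seekport" bad_bot

# Deny any request tagged bad_bot (Apache 2.4 mod_authz_core syntax).
<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Location>
```

To check whether the block actually takes effect behind Traefik, replaying the crawler's User-Agent with `curl -A 'Mozilla/5.0 (compatible; Seekport Crawler; http://seekport.com/)' -I https://your-site.example/` should return a 403 once the match works (`your-site.example` is a placeholder for the sanitized domain).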