Closed: RogueThread closed this issue 1 year ago
Either you set `blacklist off`, which means bots will also be able to crawl your link, or you remove 127.0.0.1 (localhost) from the blacklist file.
Of course I'd prefer to use the blacklist to keep bots from crawling, but the issue remains that 127.0.0.1 gets added to the blacklist after visits from external IPs (see the timestamps in the screenshot).
This is still an issue as far as I know. Shouldn't the blacklist use the last X-Forwarded-For IP instead of the connection's source address? At least that would make sense in my case; it would be nice to have an option for that.
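For illustration, here is a minimal sketch (in Go, not the project's actual code) of what "use the last X-Forwarded-For IP" could look like: a hypothetical `clientIP` helper that prefers the last entry of the header (the one appended by the trusted local hop) and only falls back to the raw remote address when the header is missing, so a listener sitting behind a local proxy does not see every visitor as 127.0.0.1.

```go
package main

import (
	"fmt"
	"net"
	"net/http"
	"strings"
)

// clientIP is a hypothetical helper, not an existing API of the tool.
// It returns the last X-Forwarded-For entry when the header is present
// (earlier entries can be spoofed by the client; the last one was appended
// by the local proxy hop), and otherwise falls back to the TCP remote address.
func clientIP(r *http.Request) string {
	if xff := r.Header.Get("X-Forwarded-For"); xff != "" {
		parts := strings.Split(xff, ",")
		return strings.TrimSpace(parts[len(parts)-1])
	}
	host, _, err := net.SplitHostPort(r.RemoteAddr)
	if err != nil {
		return r.RemoteAddr
	}
	return host
}

func main() {
	// Example: connection arrives from localhost, but the local hop appended
	// the real visitor's address to X-Forwarded-For after a spoofed entry.
	req, _ := http.NewRequest("GET", "http://example.test/", nil)
	req.RemoteAddr = "127.0.0.1:51234"
	req.Header.Set("X-Forwarded-For", "198.51.100.9, 203.0.113.7")
	fmt.Println(clientIP(req)) // prints 203.0.113.7, not 127.0.0.1
}
```

If the blacklist recorded the result of something like `clientIP(r)` rather than the socket's source address, 127.0.0.1 would no longer end up on the list in this setup.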
Hi,
With the `blacklist unauth` setting, I get a visit from an external attacker, then 127.0.0.1 gets added to the blacklist, and after that the phishing page no longer works: every IP is "unauthorized", and external visits are treated as coming from "127.0.0.1". See screenshot: