Closed GoonTools closed 6 months ago
@GoonTools @shelld3v I tried to contribute a fix for this issue, but it seems to be working already, or to have been fixed.
@ajcriado I don't remember fixing this, and I don't see the fix in the code either. You should probably test with the `--delay` flag set to a high value.
@shelld3v @GoonTools I tried the provided command and it was working as expected, delaying the requests by 1 second.
In the following video I recorded, you can see that without the delay flag the fuzzing runs at around 50 requests per second on average. With the delay flag specified (and threads set to 1, to keep it simple), the fuzzing runs at 1 request per second.
https://github.com/maurosoria/dirsearch/assets/20457637/8220963b-7efe-4e63-a287-c3c4de0bdcb0
And using a custom wordlist (with known resources), we can see it more clearly.
@ajcriado what @GoonTools is talking about here is the "baseline requests". Basically, before brute-forcing, dirsearch sends some requests to test the server's behavior on different path patterns. To see that happening in the background, you can run dirsearch through a proxy server (Burp Suite?)
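For illustration, the baseline probing described above can be sketched like this: before brute-forcing, generate one random, almost-certainly-nonexistent path per supplied extension and request each one to sample the server's "not found" behavior. This is a hypothetical sketch of the idea only, not dirsearch's actual implementation:

```python
import secrets

def baseline_paths(extensions, length=12):
    """Build one random path per extension, plus one extensionless path,
    to probe the server's "not found" behavior.
    (Hypothetical sketch -- not dirsearch's actual code.)"""
    token = secrets.token_hex(length // 2)  # random string unlikely to exist
    paths = [f"/{token}"]                   # bare-path baseline probe
    for ext in extensions:
        paths.append(f"/{token}.{ext}")     # one probe per extension
    return paths

# One baseline request would then be sent per generated path --
# which is exactly why an unthrottled loop here can trip rate limits.
print(baseline_paths(["php", "txt", "html"]))
```

Each of these probes is a real HTTP request, so if they are sent back-to-back they can trigger rate limiting before the actual brute-force even starts.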
@shelld3v okay, noted! Let me fix the issue and I'll let you know.
What is the current behavior?
When using the `--delay` flag to avoid rate limits, the initial HTTP requests that are sent to establish what the baseline "not found" responses should be (one for each supplied extension) do not respect the delay and are sent in quick succession. I have to proxy dirsearch through Burp Suite, intercept the requests, and manually click continue on the first few requests to prevent being rate limited. After that it works fine.

What is the expected behavior?

All HTTP requests sent should respect the `--delay` flag, even the ones intended to tune detection.

Any additional information?
Command:

```
dirsearch -u https://REDACTED/ -e php,txt,html,md,xml,bk --delay=1 -t 1
```

Notice the timestamps:

![image](https://github.com/maurosoria/dirsearch/assets/43768871/583c34d2-69e7-481f-adaa-0f1731e7b21b)
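One way to get the expected behavior is to funnel every outgoing request, baseline probes included, through a single throttle that honors the configured delay. A minimal sketch under that assumption (the class and function names here are hypothetical, not dirsearch's real API):

```python
import time

class ThrottledRequester:
    """Enforce a minimum interval between *all* outgoing requests,
    baseline probes included. Hypothetical sketch, not dirsearch code."""

    def __init__(self, send, delay):
        self._send = send    # underlying request function
        self._delay = delay  # seconds between requests (the --delay value)
        self._last = 0.0     # monotonic timestamp of the last request

    def request(self, path):
        # Sleep only for the remaining portion of the delay window.
        wait = self._delay - (time.monotonic() - self._last)
        if wait > 0:
            time.sleep(wait)
        self._last = time.monotonic()
        return self._send(path)

# Demo with a stub sender: the baseline probe and the first wordlist
# request go through the same throttle, so they are spaced apart.
sent = []
r = ThrottledRequester(send=lambda p: sent.append(p), delay=0.2)
t0 = time.monotonic()
r.request("/abc123.php")  # baseline probe is throttled too
r.request("/index.php")   # regular brute-force request
print(len(sent), time.monotonic() - t0 >= 0.2)  # → 2 True
```

Because both call sites share the same requester, there is no code path that can bypass the delay, which is the property the issue asks for.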