Closed: ehsandeep closed this issue 5 years ago.
Also a point to note: for the same target it works a few times and throws the error at other times.
Hi, unfortunately that basically means that you are trying to go too fast for your connection, and the file handles for DNS requests pile up. I have a few ideas about how to fix this in the future, but at the very least I should make the error message more reasonable.
For the time being, lowering the number of concurrent requests (-t 40 for example) should fix it for you. As a quick "non-fix" I could make the error message in this situation say something along the lines of:
Looks like your connection cannot keep up with the rate of requests, consider lowering the amount of concurrent requests (-t)
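As a rough illustration of the workaround, a run with a lowered thread count might look like this (the wordlist path and target URL here are placeholders, not taken from the report):

ffuf -t 40 -w wordlist.txt -u https://target.example/FUZZ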
As an additional note, the reason why you might not be seeing this on every run with the same wordlist and against the same host could be: the -t value is too high, but your wordlist is short enough that for some of the runs enough of the file handles get closed before running into the limits. One way to mitigate the issue would be to change the ulimit setting, but the issue would still appear when using a longer wordlist.
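For reference, on Linux/macOS the per-process open file limit can usually be checked and raised for the current shell session like this (8192 is just an illustrative value; how high you can go depends on the system's hard limit):

ulimit -n        # show the current soft limit for open file descriptors
ulimit -n 8192   # raise it for this shell session, up to the hard limit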
Thanks for all the suggestions, I will lower the -t. And just for your information, I was running this with a wordlist of 6k entries on a VPS with a good bandwidth connection.
I dug in a bit, and found the culprit. A fix will be implemented in the next release. Thanks for opening the issue! In my own tests the speed went up from ~4.5k req/sec to ~7k req/sec, which I believe is the limit of my connection.
Aaand v0.5 is out. You should now be able to crank the -t way higher than 50.
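For example, a run with a much higher thread count might look something like this (the thread count, wordlist and target below are illustrative placeholders, not values from this thread):

ffuf -t 200 -w wordlist.txt -u https://target.example/FUZZ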
Thank you for the quick fix!
Hi @joohoi,
Thank you for working on this. I have just noticed this error a few times now with random targets:
ffuf -t 50 -fs 0 -k -mc 200 -w word.txt -u https://test.site.com/FUZZ