devanshbatham / ParamSpider

Mining URLs from dark corners of Web Archives for bug hunting/fuzzing/further probing
MIT License
2.53k stars 427 forks

Fix Continuation Problem #126

Open nitish800 opened 5 months ago

nitish800 commented 5 months ago

When I use `paramspider -L domains.txt`, if for some reason there is an error fetching a URL even after the 3rd retry, the code exits and stops scanning the remaining URLs.

This commit allows the tool to skip any URL that still fails to fetch after the 3rd retry and move on to the next URL in the list. It works for me.
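
For reference, here is a minimal sketch of the skip-and-continue behaviour described above. This is not the actual ParamSpider code: the `fetch_url`/`fetch_all` helpers and the use of `requests` are assumptions made purely for illustration of the retry-then-skip pattern.

```python
import requests

MAX_RETRIES = 3  # retry each URL up to 3 times before giving up on it


def fetch_url(url, timeout=10):
    """Fetch a single URL, returning the response body or raising on failure."""
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    return response.text


def fetch_all(urls):
    """Try every URL; skip the ones that keep failing instead of exiting."""
    results = {}
    for url in urls:
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                results[url] = fetch_url(url)
                break  # success: stop retrying this URL
            except requests.RequestException as exc:
                print(f"[{attempt}/{MAX_RETRIES}] error fetching {url}: {exc}")
        else:
            # All retries failed: skip this URL and continue with the next one
            # instead of terminating the whole scan.
            print(f"skipping {url} after {MAX_RETRIES} failed attempts")
    return results
```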

knrredhelmet commented 2 months ago

You can check out this tool: https://github.com/knrredhelmet/paraminer