devanshbatham / ParamSpider

Mining URLs from dark corners of Web Archives for bug hunting/fuzzing/further probing
MIT License
2.34k stars 403 forks

Getting this error #97

Closed apprahuman closed 10 months ago

apprahuman commented 10 months ago

[INFO] Fetching URLs for http://testphp.vulnweb.com
[INFO] Found 6652 URLs for http://testphp.vulnweb.com
[INFO] Cleaning URLs for http://testphp.vulnweb.com
[INFO] Found 515 URLs after cleaning
[INFO] Extracting URLs with parameters
Traceback (most recent call last):
  File "/usr/local/bin/paramspider", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/paramspider/main.py", line 161, in main
    fetch_and_clean_urls(domain, extensions, args.stream, args.proxy, args.placeholder)
  File "/usr/local/lib/python3.10/dist-packages/paramspider/main.py", line 111, in fetch_and_clean_urls
    with open(result_file, "w") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'results/http://testphp.vulnweb.com.txt'

ExCave commented 10 months ago

How do I fix this?

ramsy0dev commented 10 months ago

@apprahuman I think you shouldn't include the http:// with the URL in the -d flag. According to the README.md, it takes just the domain, so you should change your command to the following:

paramspider -d testphp.vulnweb.com

After doing so, the output is:

                                      _    __
   ___  ___ ________ ___ _  ___ ___  (_)__/ /__ ____
  / _ \/ _ `/ __/ _ `/  ' \(_-</ _ \/ / _  / -_) __/
 / .__/\_,_/_/  \_,_/_/_/_/___/ .__/_/\_,_/\__/_/
/_/                          /_/

                              with <3 by @0xasm0d3us

[INFO] Fetching URLs for testphp.vulnweb.com
[INFO] Found 6662 URLs for testphp.vulnweb.com
[INFO] Cleaning URLs for testphp.vulnweb.com
[INFO] Found 516 URLs after cleaning
[INFO] Saved cleaned URLs to results/testphp.vulnweb.com.txt

This error happens because the OS treats the / in http:// as a path separator; the resulting directory doesn't exist, so the FileNotFoundError exception gets raised. I think we should add a regex to detect these prefixes (http://, https://, and any /) and replace them with an empty string.
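A minimal sketch of that idea (the helper name clean_domain is hypothetical, not part of ParamSpider):

import re

def clean_domain(raw: str) -> str:
    # Strip a leading http:// or https:// scheme
    domain = re.sub(r"^https?://", "", raw.strip())
    # Drop anything after the first remaining slash (path, query, etc.)
    return domain.split("/", 1)[0]

print(clean_domain("http://testphp.vulnweb.com/"))  # testphp.vulnweb.com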

apprahuman commented 10 months ago

I think it errors out when it tries to save the result: the forward slashes (//) in the name get interpreted as a folder path, so it effectively changes the save location. This happened to me when I was making a tool too, so what I did was replace the forward slashes with dashes "-":

filename=$(echo "$url" | sed 's/\//-/g')
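For reference, the same substitution in Python (illustrative only; url is a placeholder variable, not a ParamSpider identifier):

url = "http://testphp.vulnweb.com"
filename = url.replace("/", "-")  # "http:--testphp.vulnweb.com"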

devanshbatham commented 10 months ago

-d expects a domain name, not a URL.