@apprahuman I think you shouldn't include the http:// scheme in the value you pass to the -d flag. According to the README.md it takes just the domain, so you should change your command to the following:
paramspider -d testphp.vulnweb.com
After doing so, the output is:
                                      _    __
   ___  ___ ________ ___ _  ___ ___ (_)__/ /__ ____
  / _ \/ _ `/ __/ _ `/  ' \(_-</ _ \/ / _  / -_) __/
 / .__/\_,_/_/  \_,_/_/_/_/___/ .__/_/\_,_/\__/_/
/_/                          /_/

                           with <3 by @0xasm0d3us
[INFO] Fetching URLs for testphp.vulnweb.com
[INFO] Found 6662 URLs for testphp.vulnweb.com
[INFO] Cleaning URLs for testphp.vulnweb.com
[INFO] Found 516 URLs after cleaning
[INFO] Saved cleaned URLs to results/testphp.vulnweb.com.txt
This error is caused because the OS treats the / characters in http:// as path separators, so results/http://testphp.vulnweb.com.txt points into directories that don't exist and the exception FileNotFoundError gets raised. I think we should add a regex to detect these prefixes (http://, https://, ...) and any remaining /, and replace them with an empty string.
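Something along these lines, as a minimal sketch (sanitize_domain is a hypothetical helper, not existing paramspider code), applied to the -d value before the result path is built:

import re

def sanitize_domain(domain: str) -> str:
    # Strip a leading http:// or https:// scheme, then drop any
    # remaining slashes so the value is safe as a filename component.
    domain = re.sub(r"^https?://", "", domain)
    return domain.replace("/", "")

print(sanitize_domain("http://testphp.vulnweb.com"))  # testphp.vulnweb.com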
I think it errors out when it tries to save the result, because the forward slashes (//) in the filename change the folder location it writes to. This happened to me when I was making a tool too, so what I did was replace the forward slashes with dashes "-":
filename=$(echo "$url" | sed 's/\//-/g')
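The equivalent of that sed line in Python (assuming url holds the raw -d value) would be:

url = "http://testphp.vulnweb.com"
filename = url.replace("/", "-")  # "http:--testphp.vulnweb.com"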
-d expects a domain name, not a URL.
[INFO] Fetching URLs for http://testphp.vulnweb.com
[INFO] Found 6652 URLs for http://testphp.vulnweb.com
[INFO] Cleaning URLs for http://testphp.vulnweb.com
[INFO] Found 515 URLs after cleaning
[INFO] Extracting URLs with parameters
Traceback (most recent call last):
  File "/usr/local/bin/paramspider", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/paramspider/main.py", line 161, in main
    fetch_and_clean_urls(domain, extensions, args.stream, args.proxy, args.placeholder)
  File "/usr/local/lib/python3.10/dist-packages/paramspider/main.py", line 111, in fetch_and_clean_urls
    with open(result_file, "w") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'results/http://testphp.vulnweb.com.txt'
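As an alternative to the regex idea above, the input could also be normalized with urllib.parse before the open() at main.py line 111. A minimal sketch, where to_hostname is a hypothetical helper and not paramspider's actual code:

from urllib.parse import urlparse

def to_hostname(domain: str) -> str:
    # urlparse only fills netloc when a scheme is present,
    # so fall back to the raw value for bare domain names.
    return urlparse(domain).netloc or domain

print(to_hostname("http://testphp.vulnweb.com"))  # testphp.vulnweb.com
print(to_hostname("testphp.vulnweb.com"))         # testphp.vulnweb.com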