six2dez / reconftw

reconFTW is a tool designed to perform automated recon on a target domain by running the best set of tools for scanning and finding vulnerabilities
MIT License

Increase proxy coverage? #492

Closed · geeknik closed 2 years ago

geeknik commented 2 years ago

Currently, it looks like only ffuf requests are proxied when "$PROXY" = true. It would make more sense to pass as much traffic as possible through the proxy (especially if you are using ZAP or Burp, since you can passively find more issues). So I'm asking you to please add proxy support to the nuclei, httpx, and gospider routines. Thank you.
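
For context: the proxy toggle lives in reconFTW's config file. A minimal sketch of the relevant settings, assuming the variable names used in this thread (exact names and defaults may differ in the current release):

```bash
# reconftw.cfg (sketch; names/defaults per this thread, not the literal file)
PROXY=true                            # send discovered websites/URLs to the proxy
proxy_url="http://127.0.0.1:8080"     # Burp/ZAP listener
```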

six2dez commented 2 years ago

Hi! Sorry for the delay. Currently you can proxify all the websites discovered in the HTTP probing step (the httpx output), as well as all the URLs discovered by passive methods, crawling, and fuzzing.

The only restriction here is for websites and URLs: if the number discovered exceeds the DEEP_LIMIT_2 config variable defined in reconftw.cfg, they are not proxified, just to avoid overloading Burp Suite by sending it more than 1500 websites (the default value).

Of course, you can modify the DEEP_LIMIT_2 value.
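
Roughly, the guard looks like this (a simplified sketch, not the literal source; `webs/webs.txt` and `sendToProxy` are illustrative names):

```bash
# reconftw.cfg default mentioned above:
DEEP_LIMIT_2=1500

# Simplified sketch of the guard: only proxify when the list stays under the limit.
if [[ "$PROXY" == true ]] && [[ $(wc -l < webs/webs.txt) -le $DEEP_LIMIT_2 ]]; then
    sendToProxy webs/webs.txt   # illustrative helper that feeds the list to ffuf
fi
```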

I can add an option to proxify everything no matter how many websites or URLs are in the list.

Tell me what you think.

Thanks!

six2dez commented 2 years ago

I re-read this issue and want to clarify one thing: reconFTW uses ffuf to proxify the websites and URLs; it's not proxifying only the sites found by fuzzing. For example, here is the relevant part of the web probing function:

(screenshot: the ffuf call in the web probing function that forwards discovered websites to the proxy)
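
In essence, the call is just this (a simplified sketch, not the exact source; paths are illustrative):

```bash
# Simplified sketch of the proxy step in web probing: ffuf iterates over every
# discovered website and replays each request through the configured proxy.
if [[ "$PROXY" == true ]]; then
    ffuf -mc all -w "$dir/webs/webs.txt" -u FUZZ -replay-proxy "$proxy_url" -s >/dev/null
fi
```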

geeknik commented 2 years ago

Well, I couldn't find any instance of gospider using the -p or --proxy flag. I also don't see where the -proxy or -http-proxy flags are used with httpx. And lastly, there is no usage of the -p or -proxy flag with nuclei, so all of those requests go straight out over the wire and not through a proxy. It's great that ffuf uses the proxy, but that doesn't help the rest of the tools being run. 👍🏻

six2dez commented 2 years ago

I don't think I made myself clear. reconFTW doesn't use the built-in proxy features of those tools because it does some post-processing on the tools' results.

For example, in the URL extraction function, it runs gau and waybackurls for passive collection and gospider for active collection. After that, it merges all the results, removes duplicates, removes entries that start with special characters or don't use the http/s protocol, removes empty lines, and removes out-of-scope lines. Then it uses ffuf to iterate over the URLs and send them to the proxy. And this is the important part: I use ffuf here only to iterate over the URLs and send everything to the proxy, not to add anything.
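
Schematically, the flow is something like this (a sketch with illustrative file names, not the literal code):

```bash
# Sketch of the URL extraction post-processing described above.
cat .tmp/gau.txt .tmp/wayback.txt .tmp/gospider.txt |
    grep -aEi '^https?://' |    # keep only http/s URLs (drops junk/special chars)
    grep "$domain" |            # drop out-of-scope entries
    sed '/^$/d' |               # remove empty lines
    sort -u > webs/url_extract.txt   # merge + dedupe

# ffuf then iterates the clean list and replays each URL through the proxy:
ffuf -mc all -w webs/url_extract.txt -u FUZZ -replay-proxy "$proxy_url" -s >/dev/null
```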

The same happens with httpx: I collect the websites and store them with all the important info, like title, fingerprints, technologies, etc., then I post-process that info to clean out-of-scope sites and so on. At the end of the function, the websites are sent to ffuf just to forward them to the proxy in a fast way.
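
For example (a simplified sketch; the httpx flags are from its documentation, and the file names are illustrative):

```bash
# Sketch: probe subdomains and keep the useful metadata (title, status, tech).
httpx -l subdomains/subdomains.txt -title -status-code -tech-detect \
      -o .tmp/web_full_info.txt

# ...post-processing here: clean out-of-scope sites, extract the URL column...

# Finally, the clean website list is handed to ffuf just to replay it
# through the proxy (same trick as above):
ffuf -mc all -w webs/webs.txt -u FUZZ -replay-proxy "$proxy_url" -s >/dev/null
```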

six2dez commented 2 years ago

From your request, nuclei findings are the only thing that is not sent to the proxy, so I will work on this :)
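
One possible way to do that with the same ffuf trick (a hypothetical sketch, not the committed fix):

```bash
# Hypothetical option: pull the matched URLs out of nuclei's output files and
# replay them through the proxy with ffuf, like the other lists.
grep -haEo 'https?://[^ ]+' nuclei_output/*.txt | sort -u > .tmp/nuclei_urls.txt
ffuf -mc all -w .tmp/nuclei_urls.txt -u FUZZ -replay-proxy "$proxy_url" -s >/dev/null
```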