Hi @CuriousIncident,
Thank you for creating this issue. It would be great if you could add more specific points or examples. What we know currently: shuffledns stores its data in a temp directory if you don't specify a directory to use for processing with the -directory flag, and we need to add cleanup of the temp directory files created by shuffledns after completing the scan. Also, after this merge https://github.com/projectdiscovery/shuffledns/commit/320ed83fdec49ab6002d1fddd6ec4f53e99d23b3 we no longer process the RAW format of the massdns output, so make sure to update using go get -u.
For testing, we keep running shuffledns frequently with a 1M wordlist, with an average run time of 1 minute, without any space issues on a 100G box.
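For reference, a minimal Go sketch of what the cleanup described above could look like: create a dedicated temp directory per run and remove it once the scan finishes. This is illustrative only and does not reflect shuffledns internals; the `runScan` function and file names are hypothetical.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

// runScan is a hypothetical stand-in for a resolver run that writes
// intermediate files (massdns output, merged wordlists, etc.) into dir.
func runScan(dir string) error {
	tmpFile := filepath.Join(dir, "massdns-output.txt")
	return os.WriteFile(tmpFile, []byte("example intermediate data\n"), 0o644)
}

func main() {
	// Create a dedicated temp directory for this run, similar to the
	// behaviour described above when -directory is not supplied.
	dir, err := os.MkdirTemp("", "shuffledns-example-*")
	if err != nil {
		log.Fatal(err)
	}
	// Ensure intermediate files are removed once the scan finishes,
	// so they do not accumulate in /tmp between runs.
	defer os.RemoveAll(dir)

	if err := runScan(dir); err != nil {
		log.Fatal(err)
	}
	fmt.Println("scan finished, temp files cleaned from", dir)
}
```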
I tried several runs again and checked my version; it turns out I had a forgotten instance running on a different screen overnight, and that is what filled up the VPS. If you like, I can close the issue.
Hi @CuriousIncident,
Thank you for confirming this. Please do not close it, as we might add something to make sure we are cleaning up everything from the /tmp directory after completing the scan.
Confirmed we clean up all the files after a successful run, so I will go ahead and close this issue. Thank you.
When hunting on larger companies, I easily end up with over 10 million possible subdomains. Since shuffledns stores the massdns output in raw format, there is no way to run it on a 300 GB VPS or over a longer period of time. Maybe something to be improved or fixed in the future.