projectdiscovery / dnsx

dnsx is a fast and multi-purpose DNS toolkit that lets you run multiple DNS queries of your choice with a list of user-supplied resolvers.
https://docs.projectdiscovery.io/tools/dnsx
MIT License

Can't resolve big domain lists #378

Open oldDuDe124 opened 1 year ago

oldDuDe124 commented 1 year ago

Hi, I'm trying to resolve a wordlist containing roughly 2.2 million domains. I know some of the domains are valid and must resolve, but when I pass the list to dnsx it doesn't seem to resolve them and finishes the scan after about 10 seconds. I've tried stdin input (cat list.txt | dnsx -silent) and the -l option (dnsx -l list.txt -silent), but no luck. I also tried reducing the list to 1k lines and that worked, so I guess the size of the list is the main problem here.
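A workaround sketch for reports like this (not something confirmed by the maintainers): split the big list into chunks and resolve each chunk separately, so a silent failure only loses one chunk's worth of work. The 100k chunk size and the generated stand-in list are arbitrary choices for illustration.

```shell
# Build a stand-in input list (250k fake domains) just for the demo.
seq 1 250000 | sed 's/.*/sub&.example.com/' > list.txt

# Split into 100k-line chunks named chunk_aa, chunk_ab, chunk_ac, ...
split -l 100000 list.txt chunk_

for f in chunk_*; do
  # dnsx may not be installed in every environment; skip the call if missing.
  command -v dnsx >/dev/null && dnsx -l "$f" -silent >> resolved.txt
done

echo "chunks: $(ls chunk_* | wc -l)"   # 250k lines / 100k per chunk = 3 chunks
```

Besides bounding memory per run, this makes it easy to spot which chunk (if any) triggers the early exit.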

MetzinAround commented 1 year ago

Does anything happen when you use -stats during a run?

ehsandeep commented 1 year ago

@oldDuDe124 do you see any error that you can share when you run with the -v option?

NotoriousRebel commented 1 year ago

The same error occurs with tlsx as it does here when giving it a file of IP ranges. A few subnets totaling at least 60,000 IPs cause the process to get killed after around 10 seconds of running. Running with -v gives no errors, but it should be easy to replicate since you only need a few subnets in a file. For tlsx: tlsx -c 72 -v -l ip_ranges.txt -ro -san, and for dnsx: dnsx -t 72 -l ip_ranges.txt -ptr -resp-only. It may just be caused by too many IP ranges being expanded at once; I notice a huge spike in CPU usage before the crash.
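The "too many IP ranges expanded at once" theory is easy to sanity-check with arithmetic: an IPv4 /N prefix covers 2^(32-N) addresses, so even a handful of mid-size subnets exceeds the 60,000 IPs mentioned above. The prefixes below are made up for illustration, not taken from the reporter's file:

```shell
# Host count for an IPv4 /N prefix is 2^(32-N).
total=0
for cidr in 10.0.0.0/16 10.1.0.0/16 192.168.0.0/20; do
  bits=${cidr#*/}                 # strip everything up to the slash
  n=$((1 << (32 - bits)))        # 2^(32-N) addresses in the prefix
  total=$((total + n))
  echo "$cidr expands to $n addresses"
done
echo "total: $total"             # 65536 + 65536 + 4096 = 135168
```

If all of those targets are materialized in memory at once rather than streamed, a spike before the crash is exactly what you would expect.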

hastalamuerte commented 7 months ago

Same here: no errors, but output for only a small part of the list. It happens in dnsx, and also in httpx with the -asn flag:

dnsx -l URL.txt -asn -v
httpx ...

Mzack9999 commented 2 months ago

Memory usage should not grow with the list size, as the list is offloaded to disk via LevelDB. I'm unable to replicate the crash. Would it be possible to share the target list? (Feel free to join our Discord and DM it to any PD team member, referencing this GitHub issue, or directly to me, mzack9999.) Thanks!
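Since several reporters describe the process dying silently after a CPU/memory spike, the Linux OOM killer is a plausible suspect. A quick, Linux-specific check reporters could run (a sketch only; the dnsx command line in the comment is the one from the earlier report, and dmesg may require root):

```shell
# Look for kernel OOM-killer events; fall back to a note if none are visible
# (or if dmesg is not readable without privileges).
(dmesg 2>/dev/null | grep -i "out of memory") || echo "no OOM events visible"

# Peak memory of a run can be captured with GNU time, e.g.:
#   /usr/bin/time -v dnsx -l ip_ranges.txt -ptr -resp-only 2>&1 | grep "Maximum resident"
```

An OOM-killer entry in the kernel log would explain a clean "killed after ~10 seconds" with no error output from the tool itself.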