haccer / subjack

Subdomain Takeover tool written in Go
Apache License 2.0

Large subdomain list: Error: too many open files #17

Open PjMpire opened 6 years ago

PjMpire commented 6 years ago

I'm enumerating through a high number of subdomains from a list, and when I get around halfway down the list, I get a "too many open files" message and the enumeration stops.

[screenshots: the subjack command used and the "too many open files" error output]

haccer commented 6 years ago

Hi @HellboundTKR,

I've run into this problem in the past. To help you mitigate this issue, I have a few questions:

Thanks

haccer commented 6 years ago

Also, I would suggest running cat all_subdomains.lst | sort | uniq > new_all_subdomains.lst to remove duplicate subdomains (from the screenshot it looks like you have duplicates).

haccer commented 6 years ago

Okay, another update.

In an old, old version of subjack, when I was using net/http instead of fasthttp, I had a Connection: close header set, which mitigated this issue in the past.

I just pushed an update to re-add this header to each request.
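
For reference, the change boils down to something like this (a rough sketch, not the exact code in subjack):

```go
// Rough sketch of setting "Connection: close" on each fasthttp request
// so sockets aren't kept open between checks.
package main

import (
	"fmt"

	"github.com/valyala/fasthttp"
)

func main() {
	req := fasthttp.AcquireRequest()
	resp := fasthttp.AcquireResponse()
	defer fasthttp.ReleaseRequest(req)
	defer fasthttp.ReleaseResponse(resp)

	req.SetRequestURI("https://example.com") // placeholder target
	req.Header.Set("Connection", "close")    // ask the server to close the socket after responding

	if err := fasthttp.Do(req, resp); err != nil {
		fmt.Println("request failed:", err)
		return
	}
	fmt.Println("status:", resp.StatusCode())
}
```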

Please retest and confirm this is a working solution after you've:

Thanks

PjMpire commented 6 years ago

Hi,

The list is around 100k subdomains. My OS is ParrotOS 64-bit, and ulimit -n was 1024.

I changed the ulimit to unlimited, but the message is still occurring even after the patch.

thanks

haccer commented 6 years ago

@HellboundTKR that's very strange if it's still occurring after you set it to unlimited in the same session.

I'm attempting to reproduce this issue with a list containing ~246k subdomains.

➜  subjack git:(master) ✗ ulimit -n
4864

So far I've gone through over 50k of those subdomains without any errors, using the following command, similar to the one you posted above:

./subjack -w cname_list.txt -t 50 -o subjackresults.txt -ssl

Do you have an estimate of how many subdomains you're able to enumerate before you experience the 'too many open files' error?

haccer commented 6 years ago

Just an update: I've surpassed 100k subdomains without any errors.

haccer commented 6 years ago

Perhaps this is a low-memory / low-CPU issue? I ran into "too many open files" a long time ago on my 1 GB RAM Ubuntu box... I'm doing this current scan with the 246k subdomain list on my MacBook Pro. Are you running this on Parrot OS in a VM? With low memory/CPU?

A possible workaround would be to split the large file into chunks.
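
If you want to script the chunking, something like this works (a quick sketch; coreutils split -l 50000 all_subdomains.lst chunk_ does the same job from the shell):

```go
// Sketch: split a large wordlist into fixed-size chunk files that can be
// fed to subjack one at a time. Filenames here are just illustrative.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
)

func main() {
	const linesPerChunk = 50000 // tune to taste

	in, err := os.Open("all_subdomains.lst") // hypothetical input filename
	if err != nil {
		log.Fatal(err)
	}
	defer in.Close()

	var out *os.File
	var w *bufio.Writer
	scanner := bufio.NewScanner(in)
	for i := 0; scanner.Scan(); i++ {
		if i%linesPerChunk == 0 { // start a new chunk file
			if w != nil {
				w.Flush()
				out.Close()
			}
			if out, err = os.Create(fmt.Sprintf("chunk_%03d.lst", i/linesPerChunk)); err != nil {
				log.Fatal(err)
			}
			w = bufio.NewWriter(out)
		}
		fmt.Fprintln(w, scanner.Text())
	}
	if w != nil {
		w.Flush()
		out.Close()
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```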

AnotherWayIn commented 6 years ago

Can confirm it chugs through my list of 750k domains quite happily on a Kali VM with 2 GB RAM.

PjMpire commented 6 years ago

Yeah, it's strange; I get to about 80k domains when it stops.

I'm using it in a VirtualBox VM with 4 cores and 4 GB memory on an i7-4770K @ 4.5 GHz.

The thing is, I can chug through my list with no problems when using the SubOver tool.

haccer commented 6 years ago

Subjack does make a lot more connections and requests than SubOver (which is based on an older version of Subjack) to accurately check for a possible subdomain takeover.

I'll set up a Parrot OS VM this weekend, then run a series of tests, including testing with the default Parrot OS installation and testing with my suggested performance optimizations.

If anyone reading this issue is experiencing the same problem, please comment with your OS, wordlist size, memory, and CPU details, and I will try to replicate that as well. Thanks.

HeisenbugHQ commented 6 years ago

Same here, but I don't think it's a VM problem. You can easily fix it by adjusting the -t flag. On my 90 Mb/s fiber connection I'm using -t 150, but if I tune it up to 200 -> too many open files. Is there any other way to fix it? How can I speed it up?

haccer commented 6 years ago

@HeisenbugHQ well, 200 threads is a lot... I don't recommend going past 100. It's important to keep in mind that the more you increase it, the harder your machine is going to have to work.
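
For context, the thread count is effectively a cap on how many sockets are open at the same time. A rough sketch of that general pattern (not subjack's actual code, which uses fasthttp):

```go
// Sketch: a buffered channel works as a semaphore so at most `threads`
// requests, and therefore at most that many sockets, are in flight at once.
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	subdomains := []string{"a.example.com", "b.example.com", "c.example.com"}
	threads := 2 // analogous to the -t flag

	sem := make(chan struct{}, threads)
	var wg sync.WaitGroup

	for _, s := range subdomains {
		wg.Add(1)
		sem <- struct{}{} // blocks once `threads` requests are in flight
		go func(host string) {
			defer wg.Done()
			defer func() { <-sem }() // free the slot when done

			resp, err := http.Get("http://" + host)
			if err != nil {
				fmt.Println(host, "error:", err)
				return
			}
			resp.Body.Close() // close promptly so the file descriptor is released
			fmt.Println(host, resp.Status)
		}(s)
	}
	wg.Wait()
}
```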

--

@everyone The underlying issue here is that the 'too many open files' error occurs when there are too many connections open.

I've done as much research as I could, and the only solution for the 'too many open files' error is to raise the ulimit (ulimit -n unlimited).
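
A process can also raise its own soft limit at startup, up to the hard cap; here's a rough sketch of that approach in Go on Linux (just an illustration, this isn't something subjack does):

```go
// Sketch: raise this process's RLIMIT_NOFILE soft limit to the hard limit.
package main

import (
	"fmt"
	"log"
	"syscall"
)

func main() {
	var rl syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		log.Fatal(err)
	}
	rl.Cur = rl.Max // raise the soft limit to the hard limit
	if err := syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("open-file limit now %d\n", rl.Cur)
}
```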

Taking all of this into consideration, the only solutions to remediate this issue are:

hdbreaker commented 5 years ago

Hi guys! I'm having the same issue with a list of 14,000 hosts.

1. ulimit is set to unlimited
2. My box has 4 GB RAM, 2 cores, and a 50 GB SSD
3. This happens with 30 threads

marcelo321 commented 4 years ago

I don't know why this happens. I had no errors with a list of 1M+ subdomains at 30-50 threads, but today I tried again at 80 threads with a list of 100k subdomains and got this error.

I will try again with no threads at all and update this comment.

EDIT: Apparently if you just lower the thread count it will run just fine. Play with it until you have no errors.

zeknox commented 4 years ago

I ran into a similar but different issue while leveraging this tool. The error I ran into was specifically around fingerprints.json: too many open files. I've created a PR which seems to address this issue and may address other folks' issues as well.

https://github.com/haccer/subjack/pull/49
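
For context, the general pattern behind that kind of fix is to parse fingerprints.json once at startup and share the result across workers, instead of reopening the file for every check. A rough sketch (not the actual PR; the struct shape here is simplified and hypothetical):

```go
// Sketch: load the fingerprint file once; worker goroutines then read the
// shared slice and never touch the file again, so no descriptors pile up.
package main

import (
	"encoding/json"
	"log"
	"os"
)

// Fingerprint is a hypothetical, simplified shape of one entry.
type Fingerprint struct {
	Service     string   `json:"service"`
	Fingerprint []string `json:"fingerprint"`
}

var fingerprints []Fingerprint // loaded once, read-only afterwards

func loadFingerprints(path string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	return json.Unmarshal(data, &fingerprints)
}

func main() {
	if err := loadFingerprints("fingerprints.json"); err != nil {
		log.Fatal(err)
	}
	log.Printf("loaded %d fingerprints", len(fingerprints))
	// ... worker goroutines read the shared slice; no further file opens.
}
```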

abi1915 commented 4 years ago

Same error: too many open files. Any updates?

Phoenix1112 commented 4 years ago

Same error: too many open files. Any updates?

Are you Turkish?