Stratus-Security / Subdominator

The Internet's #1 Subdomain Takeover Tool
https://www.stratussecurity.com
MIT License

Documentation is very light. No output after running command #2

Closed edgibbs104 closed 5 months ago

edgibbs104 commented 6 months ago

I have a list of subdomains that I wanted to check in a single column csv text file on Windows. I downloaded Subdominator.exe on my Windows 11 machine.

dev> .\Subdominator.exe -l subs.csv
0/0 domains processed. Average rate: 0.00 domains/sec
Done in 1.65E-05s

dev> $PSVersionTable

Name                           Value
----                           -----
PSVersion                      5.1.22621.2428
PSEdition                      Desktop
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   10.0.22621.2428
CLRVersion                     4.0.30319.42000
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1

Possible problem?

coj337 commented 6 months ago

That's very weird, it seems that it didn't notice any valid domains in your input file. Do you have a copy (or subset) you could share so I can debug? If you use the verbose flag (-v) it will tell you if it's detecting any domains as invalid too
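To pre-check a list locally before filing an issue, something like the sketch below can flag lines that obviously aren't hostnames. The regex is a rough simplification, NOT Subdominator's actual validation logic, and the file name is just an example:

```python
import re

# Rough hostname shape: dot-separated labels of letters, digits, hyphens.
# This is a simplification, not Subdominator's actual validation.
HOST_RE = re.compile(
    r"^[a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?"
    r"(\.[a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?)+$"
)

def suspicious_lines(path):
    """Return (line_number, text) for non-blank lines that don't look like hostnames."""
    with open(path) as f:
        return [
            (n, line.strip())
            for n, line in enumerate(f, 1)
            if line.strip() and not HOST_RE.match(line.strip())
        ]
```

If this returns most of your file, the input probably isn't one domain per line (e.g. stray headers, BOMs, or extra CSV columns).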

edgibbs104 commented 6 months ago

I just ran it against a list of CNN subdomains

It's not clear what services you're checking against; do you have a list? It may well be that the subdomains I chose for a particular root domain don't use any of the services on your list. Does it only look for cloud, SaaS, and CDN services?

cnn-full.csv

subs> Subdominator.exe -l cnn-full.csv
[Fastly] search.arabic.cnn.com - CNAME: www.cnnarabic.com., cnn-tls.map.fastly.net.
[Fastly] dev.client.appletv.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] client.appletv.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] fave.edition.cnn.com - A: 151.101.131.5, 151.101.67.5, 151.101.195.5, 151.101.3.5
[Fastly] api.etp.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] qa.money.cnn.com - CNAME: cnn-tls.map.fastly.net.
5084/8980 domains processed. Average rate: 84.73 domains/sec
[Fastly] preview.train.money.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Unbounce] get.collection.cnn.com - CNAME: unbouncepages.com.
[Fastly] at.cnn.com - A: 151.101.195.5, 151.101.131.5, 151.101.67.5, 151.101.3.5
[Fastly] train.money.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] tour.cnn.com - CNAME: tours.cnn.com., cnn-tls.map.fastly.net.
[Fastly] weather.edition.cnn.com - A: 151.101.195.5, 151.101.131.5, 151.101.67.5, 151.101.3.5
[Fastly] preview.qa.money.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] cnne-prod.cnn.com - CNAME: cnn-tls.map.fastly.net.
[Fastly] mw.cnn.com - CNAME: m.cnn.com., lite.cnn.com., cnn-tls.map.fastly.net.
8980/8980 domains processed. Average rate: 84.11 domains/sec
Done in 106.7666613s

coj337 commented 6 months ago

You're right, it should be clear what services are checked without going on an adventure through the code. I have added a list to the readme here along with a little information about how they're created: https://github.com/Stratus-Security/Subdominator#fingerprints

As far as I'm aware (and there was a good amount of validation here), the tool will test every service included in other tools plus some extras. If it doesn't print anything, it's likely that there aren't any vulnerable services.

That said, it should never say 0/0 domains processed unless it's given an empty list, but it seems to be working properly on your CNN list. Hope that helps! If you're still having any issues let me know 😄

dreyerrd commented 6 months ago

@coj337 I'm excited to support this solution; great job so far!

I agree there isn't enough documentation on what the expected output should look like for a vulnerable site versus one that isn't, and it's not clear from your demo what to expect. When running the tool with a simple test (Subdominator.exe -l subdomains.txt), the console output differs from the output written to a file.

During my test, I selected 34 domains. The simple command returns 1 result in my CLI, but the output file lists all 34 domains, with no indication of whether a given domain is vulnerable to takeover (or why, i.e. which service check(s) failed).

Let me know if you need additional information.

coj337 commented 6 months ago

Thanks for the feedback @dreyerrd! I have updated the readme with output explanations and added a count of vulnerable domains to the final print for clarity. The tool also strips any invalid domains at the start but previously only reported this in verbose mode; it now shows invalid (skipped) domains by default to avoid confusion.

As for your output file, it should only have the 1 result as indicated in the CLI unless you're using the verbose flag. Can you update to the latest release and if it's still happening, share the input file? Note that it appends to your output file so you may need to delete it if it already exists.

edgibbs104 commented 6 months ago

I have a DNS data dump of hundreds of millions of recent CNAMEs with the structure and examples provided below. Would it be possible, perhaps with some code modification, to walk the CSV file and look for possible takeovers?

Structure:

coj337 commented 5 months ago

@edgibbs104 Rather than supporting the millions of formats out there, it's probably best to just extract them locally.

Here's a quick and dirty one-liner to extract the third column in bash:

cut -d',' -f3 yourfile.csv > extracted_domains.txt

Another for Powershell:

Get-Content yourfile.csv | ForEach-Object { ($_ -split ',')[2] } | Out-File extracted_domains.txt
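One caveat with both one-liners: they split naively on commas, so a quoted field that contains a comma will shift the columns. If the dump uses quoting, a CSV-aware sketch like this (file names are placeholders) streams the file row by row instead:

```python
import csv

def extract_third_column(src_path, dst_path):
    # Stream the file one row at a time (no full load into memory) and
    # write the third field; csv.reader honors quoting, so commas inside
    # quoted fields don't split the row.
    with open(src_path, newline="") as src, open(dst_path, "w") as dst:
        for row in csv.reader(src):
            if len(row) >= 3:
                dst.write(row[2] + "\n")
```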

edgibbs104 commented 5 months ago

Unfortunately, the file is now 600GB and would take considerable time to parse. Would it be possible to add a -c x option, where x is the column of the CSV file to read, similar to the way csvkit's csvcut and csvgrep work? It might benefit other users as well. If not, I'll just cut the column out and run with it.

coj337 commented 5 months ago

Alright, I'm convinced! @edgibbs104 I could see this being a common use case for anyone doing large-scale crawling/analysis, but I do have concerns about memory usage, since the tool reads every domain in at the beginning (though not the whole file, so it depends on how much of it is actually domains!).

I just dropped v1.65, which adds the -c/--csv flag; it accepts a column number or heading. Let me know how you go, and I'll look at how we can solve any problems that come up from your /thicc/ file.
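As an illustration of the flag's semantics only (not the tool's actual code, and assuming 1-based column numbers as in csvkit's csvcut), selecting a column by number or by heading boils down to:

```python
import csv
import io

def extract_column(text, col):
    # Illustration of -c/--csv semantics, not Subdominator's code:
    # pick a CSV column by 1-based number or by header name.
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    idx = int(col) - 1 if str(col).isdigit() else header.index(col)
    return [row[idx] for row in data]
```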