vincentcox / bypass-firewalls-by-DNS-history

Firewall bypass script based on DNS history records. This script searches DNS A-record history and checks whether the server replies for that domain. Handy for bug bounty hunters.
MIT License
1.2k stars · 262 forks

Use python instead #2

Closed: slayerlab closed this issue 3 years ago

slayerlab commented 5 years ago

Python is installed by default on most machines, but you use jq(1), which has to be installed separately. In light of that, why not use python -m json.tool instead?

vincentcox commented 5 years ago

Good remark! How exactly should I replace jq with that? Below is a line where I use jq:

curl -s https://certspotter.com/api/v0/certs?domain=$domain | jq -c '.[].dns_names'

It would become something like this:

curl -s https://certspotter.com/api/v0/certs?domain=vincentcox.com | python -m json.tool

But how do I filter on .[].dns_names as shown in the first codeblock?

Thanks in advance.

slayerlab commented 5 years ago

You can do it in the following way:

  1. Python one-liner:

$ curl -s https://certspotter.com/api/v0/certs?domain=vincentcox.com | python -c 'import sys, json; jf=json.load(sys.stdin); print json.dumps(jf[0]["dns_names"])'

The output will not be indented:

``` ["*.5735.pw", "5735.pw", "*.5927.pw", "5927.pw", "*.caucrazocam.tk", "caucrazocam.tk", "*.crover.com.sa", "crover.com.sa", "*.derdextzane.ml", "derdextzane.ml", "*.e-ebookdi.ml", "e-ebookdi.ml", "*.flucettiemiss.ga", "flucettiemiss.ga", "*.fsdholdings.com", "fsdholdings.com", "*.georgehottub.win", "georgehottub.win", "*.icaroli.ro", "icaroli.ro", "*.inbookpdf.tk", "inbookpdf.tk", "*.innovation-workshop.ro", "innovation-workshop.ro", "*.kb103.com", "kb103.com", "*.kb280.com", "kb280.com", "*.kb309.com", "kb309.com", "*.kb317.com", "kb317.com", "*.kb329.com", "kb329.com", "*.kb347.com", "kb347.com", "*.kb364.com", "kb364.com", "*.kb381.com", "kb381.com", "*.kb396.com", "kb396.com", "*.kb412.com", "kb412.com", "*.kb435.com", "kb435.com", "*.kb464.com", "kb464.com", "*.kb472.com", "kb472.com", "*.kb540.com", "kb540.com", "*.lensmenreviews.com", "lensmenreviews.com", "*.movienpic.ml", "movienpic.ml", "*.mtg.ai", "mtg.ai", "*.painasickjust.ga", "painasickjust.ga", "*.qfreadd.cf", "qfreadd.cf", "*.sagymreviews-s.ml", "sagymreviews-s.ml", "*.saolann.com", "saolann.com", "*.sislog.es", "sislog.es", "sni227043.cloudflaressl.com", "*.sprecordamage.cf", "sprecordamage.cf", "*.survivalcraft.nl", "survivalcraft.nl", "*.vemotive.net", "vemotive.net", "*.vincentcox.com", "vincentcox.com", "*.womensformaljacketdresses.cf", "womensformaljacketdresses.cf", "*.worshipc.ml", "worshipc.ml", "*.wushu-zentrum.ch", "wushu-zentrum.ch", "*.yayaqu.bid", "yayaqu.bid", "*.yourdailydish.website", "yourdailydish.website", "*.zapo.info", "zapo.info"] ```

To beautify this, just use indent=n, where n is the number of spaces used for indentation. As you can see, the command below indents with 4 spaces:

$ curl -s https://certspotter.com/api/v0/certs?domain=vincentcox.com | python -c 'import sys, json; jf=json.load(sys.stdin); print json.dumps(jf[0]["dns_names"], indent=4)'

Output:

``` [ "*.5735.pw", "5735.pw", "*.5927.pw", "5927.pw", "*.caucrazocam.tk", "caucrazocam.tk", "*.crover.com.sa", "crover.com.sa", "*.derdextzane.ml", "derdextzane.ml", "*.e-ebookdi.ml", "e-ebookdi.ml", "*.flucettiemiss.ga", "flucettiemiss.ga", "*.fsdholdings.com", "fsdholdings.com", "*.georgehottub.win", "georgehottub.win", "*.icaroli.ro", "icaroli.ro", "*.inbookpdf.tk", "inbookpdf.tk", "*.innovation-workshop.ro", "innovation-workshop.ro", "*.kb103.com", "kb103.com", "*.kb280.com", "kb280.com", "*.kb309.com", "kb309.com", "*.kb317.com", "kb317.com", "*.kb329.com", "kb329.com", "*.kb347.com", "kb347.com", "*.kb364.com", "kb364.com", "*.kb381.com", "kb381.com", "*.kb396.com", "kb396.com", "*.kb412.com", "kb412.com", "*.kb435.com", "kb435.com", "*.kb464.com", "kb464.com", "*.kb472.com", "kb472.com", "*.kb540.com", "kb540.com", "*.lensmenreviews.com", "lensmenreviews.com", "*.movienpic.ml", "movienpic.ml", "*.mtg.ai", "mtg.ai", "*.painasickjust.ga", "painasickjust.ga", "*.qfreadd.cf", "qfreadd.cf", "*.sagymreviews-s.ml", "sagymreviews-s.ml", "*.saolann.com", "saolann.com", "*.sislog.es", "sislog.es", "sni227043.cloudflaressl.com", "*.sprecordamage.cf", "sprecordamage.cf", "*.survivalcraft.nl", "survivalcraft.nl", "*.vemotive.net", "vemotive.net", "*.vincentcox.com", "vincentcox.com", "*.womensformaljacketdresses.cf", "womensformaljacketdresses.cf", "*.worshipc.ml", "worshipc.ml", "*.wushu-zentrum.ch", "wushu-zentrum.ch", "*.yayaqu.bid", "yayaqu.bid", "*.yourdailydish.website", "yourdailydish.website", "*.zapo.info", "zapo.info" ] ```

  2. Using the previous example in bash is straightforward: you can open a temporary script in your editor by pressing Ctrl+X+E and write something like this:
domain="vincentcox.com"

curl -s https://certspotter.com/api/v0/certs?domain=$domain | \
python -c 'import sys, json; \
jf=json.load(sys.stdin); \
print json.dumps(jf[0]["dns_names"], indent=4)'

Save the file and run it. If you're in Emacs, just press Ctrl+X+C and then y. The output will be the same as above. Note: I don't know how to exit nano(1), because I don't use it.

vincentcox commented 5 years ago

Great explanation! Thank you for the detailed guide. :+1: I'll test this and update the code accordingly this week and push an update. Is it worth building in a check for whether Python is installed? It's installed by default on Kali, and on Ubuntu too I think, but I have no experience with other Linux flavors.

slayerlab commented 5 years ago

My pleasure. There is a question on unix.stackexchange about which systems ship Python by default.
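
If you do want a guard anyway, something along these lines at the top of the script should be enough (untested sketch, adjust the error message as you like):

# Abort early if no Python interpreter is available in PATH
if ! command -v python >/dev/null 2>&1; then
    echo "Error: python is required but was not found." >&2
    exit 1
fi

command -v is POSIX, so it should behave the same across distributions.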

sjas commented 5 years ago

I'd point out that Python 2 will be end-of-life by 2020, so you really should go with python3 from the start; it is part of all mainstream distributions anyway.

Do you happen to have a solid test case specified somewhere, @vincentcox?

The code snippet shown doesn't work as expected; compare the outputs:

CURL

0 sjas@ssg 16:15:47 ~ $1 h1970 c7 
curl -s https://certspotter.com/api/v0/certs?domain=golem.de | jq -c '.[].dns_names'
["jobs.golem.de"]
["profis.golem.de"]
["benchmark.baqend.com","benchmark-news.baqend.com","ct17.baqend.com","einfach-weiter.de","everymillisecondcounts.eu","hafenpick.de","hannesk.de","img.edelsuff.de","io-sandbox.baqend.com","localsonly.town","makefast-asset-dev.speed-kit.com","makefast-dev.speed-kit.com","marieundfelix.wedding","polyglot.systems","sandra-erik.wedding","scdm.cloud","speedkit.golem.de","tc18.baqend.com","test-dev.speed-kit.com","test.wingerath.org","toodle.baqend.com","towelsfortalents.com","twoogle.info","www.dryw.org","www.einfach-weiter.de","www.everymillisecondcounts.eu","www.ezo.ai","www.hafenpick.de","www.hannesk.de","www.localsonly.town","www.marieundfelix.wedding","www.twoogle.info"]
["bbi.baqend.com","benchmark.baqend.com","benchmark-news.baqend.com","ct17.baqend.com","dashboard.baqend.com","einfach-weiter.de","everymillisecondcounts.eu","hafenpick.de","hannesk.de","img.edelsuff.de","io-sandbox.baqend.com","localsonly.town","makefast-asset-dev.speed-kit.com","makefast-dev.speed-kit.com","marieundfelix.wedding","polyglot.systems","sandra-erik.wedding","scdm.cloud","speedkit.golem.de","tc18.baqend.com","test-dev.speed-kit.com","test.wingerath.org","toodle.baqend.com","towelsfortalents.com","twoogle.info","www.dryw.org","www.einfach-weiter.de","www.everymillisecondcounts.eu","www.ezo.ai","www.hafenpick.de","www.hannesk.de","www.localsonly.town","www.marieundfelix.wedding","www.twoogle.info"]
["ssl.1.damoh.golem.de","ssl.2.damoh.golem.de","ssl.3.damoh.golem.de"]
["profis.golem.de"]
["*.golem.de","golem.de"]
["*.golem.de","golem.de"]
["jobs.golem.de"]

0 sjas@ssg 16:15:53 ~ $1 h1971 c8

PYTHON

0 sjas@ssg 16:14:55 ~ $1 h1968 c5 
curl -s https://certspotter.com/api/v0/certs?domain=golem.de | python -c 'import sys, json; jf=json.load(sys.stdin); print json.dumps(jf[0]["dns_names"], indent=4)'
[
    "jobs.golem.de"
]

0 sjas@ssg 16:19:34 ~ $1 h1969 c6 

PYTHON3

0 sjas@ssg 16:19:34 ~ $1 h1969 c6 
curl -s https://certspotter.com/api/v0/certs?domain=golem.de | python3 -c 'import sys, json; jf=json.load(sys.stdin); print(json.dumps(jf[0]["dns_names"], indent=4))'
[
    "jobs.golem.de"
]

0 sjas@ssg 16:20:02 ~ $1 h1970 c7 

vincentcox commented 5 years ago

I think this is caused by jf[0]["dns_names"], which takes only the first element. @SLAYEROWNER, do you have an idea how to get all the elements of the array instead of only one? @sjas: good idea to use python3, but I think on most systems python is an alias for python3?

slayerlab commented 5 years ago

  1. Retrieve all "dns_names" with the Python one-liner: @vincentcox, add a loop to retrieve all the values:
$ curl -s https://certspotter.com/api/v0/certs?domain=golem.de | python -c "import sys, json; jf=json.load(sys.stdin); print ''.join(json.dumps(jf[index]['dns_names'], indent=4) for index,value in enumerate(jf))"
The output will be the same as shown by @sjas, but indented with 4 spaces as usual.

``` [ "jobs.golem.de" ][ "profis.golem.de" ][ "benchmark.baqend.com", "benchmark-news.baqend.com", "ct17.baqend.com", "einfach-weiter.de", "everymillisecondcounts.eu", "hafenpick.de", "hannesk.de", "img.edelsuff.de", "io-sandbox.baqend.com", "localsonly.town", "makefast-asset-dev.speed-kit.com", "makefast-dev.speed-kit.com", "marieundfelix.wedding", "polyglot.systems", "sandra-erik.wedding", "scdm.cloud", "speedkit.golem.de", "tc18.baqend.com", "test-dev.speed-kit.com", "test.wingerath.org", "toodle.baqend.com", "towelsfortalents.com", "twoogle.info", "www.dryw.org", "www.einfach-weiter.de", "www.everymillisecondcounts.eu", "www.ezo.ai", "www.hafenpick.de", "www.hannesk.de", "www.localsonly.town", "www.marieundfelix.wedding", "www.twoogle.info" ][ "bbi.baqend.com", "benchmark.baqend.com", "benchmark-news.baqend.com", "ct17.baqend.com", "dashboard.baqend.com", "einfach-weiter.de", "everymillisecondcounts.eu", "hafenpick.de", "hannesk.de", "img.edelsuff.de", "io-sandbox.baqend.com", "localsonly.town", "makefast-asset-dev.speed-kit.com", "makefast-dev.speed-kit.com", "marieundfelix.wedding", "polyglot.systems", "sandra-erik.wedding", "scdm.cloud", "speedkit.golem.de", "tc18.baqend.com", "test-dev.speed-kit.com", "test.wingerath.org", "toodle.baqend.com", "towelsfortalents.com", "twoogle.info", "www.dryw.org", "www.einfach-weiter.de", "www.everymillisecondcounts.eu", "www.ezo.ai", "www.hafenpick.de", "www.hannesk.de", "www.localsonly.town", "www.marieundfelix.wedding", "www.twoogle.info" ][ "ssl.1.damoh.golem.de", "ssl.2.damoh.golem.de", "ssl.3.damoh.golem.de" ][ "profis.golem.de" ][ "*.golem.de", "golem.de" ][ "*.golem.de", "golem.de" ][ "jobs.golem.de" ] ```

  2. EOL for Python 2 in 2020: @vincentcox & @sjas Yes! I'm already aware of that, and I still keep asking you to use python, because the default python binary will eventually be Python 3. So when you type python on the command line, there will likely be a symbolic link resolving to the python3 binary, just as is done nowadays for Python 2.x. That is, you will already have something configured similar to ln -s /usr/bin/python3 /usr/bin/python, in order to minimize certain headaches that DevOps would otherwise have to deal with during application deployment, for instance.

I don't know exactly which binary name will correspond to Python 3 on a given system. So I believe that using python can prevent some future conflicts, because the python3 binary may be named "python3.4", "python3.7" (python3.x), or even "python4.x" in the far future.
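
If you ever want to see which interpreter python actually resolves to on a given box, a quick diagnostic (on Linux) is:

$ readlink -f "$(command -v python)"

which prints the real path behind any symlink, e.g. /usr/bin/python2.7 or a python3.x binary, depending on the distribution.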

vincentcox commented 5 years ago

Thanks all for the input and discussion; I learned a lot from this. I think I will go for:

curl -s https://certspotter.com/api/v0/certs?domain=golem.de | python -c "import sys, json; jf=json.load(sys.stdin); print ''.join(json.dumps(jf[index]['dns_names'], indent=4) for index,value in enumerate(jf))" | tr -d '"' | tr -d "[" | tr -d "]" | tr -d " " | tr -d "," | sort -u | grep -v "\*" | grep "\.golem.de"

Perhaps this can be done with fewer commands, but I am not a Linux wizard :)

The command above will output the following:

jobs.golem.de
profis.golem.de
speedkit.golem.de
ssl.1.damoh.golem.de
ssl.2.damoh.golem.de
ssl.3.damoh.golem.de

So in the code this will become:

curl -s https://certspotter.com/api/v0/certs?domain=$domain | python -c "import sys, json; jf=json.load(sys.stdin); print ''.join(json.dumps(jf[index]['dns_names'], indent=4) for index,value in enumerate(jf))" | tr -d '"' | tr -d "[" | tr -d "]" | tr -d " " | tr -d ","  | sort -u | grep -v "\*" | grep "\.$domain"

Before I replace this in the script and push it to master, I need to test it on my servers and my Mac. Thanks all for the suggestions and input!

slayerlab commented 5 years ago

You're welcome! 👍
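
By the way, if you later want to trim the tr/grep chain, the filtering can also be done inside Python itself. An untested sketch that prints one matching name per line (same idea, just fewer processes):

domain="golem.de"
curl -s https://certspotter.com/api/v0/certs?domain=$domain | \
python -c "import sys, json; names=set(n for e in json.load(sys.stdin) for n in e['dns_names'] if not n.startswith('*') and n.endswith('.$domain')); print('\n'.join(sorted(names)))"

It should give the same six subdomains as your tr/sort/grep pipeline.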

vincentcox commented 5 years ago

Hi all, I am trying to implement this for the following piece of test code:

domain=ing.com
curl -N -s https://www.virustotal.com/ui/domains/$domain/subdomains\?limit\= | jq .data[].id | grep -o '"[^"]\+"' | grep "$domain" | sed 's/"//g'

This outputs 10 results. I had some trouble getting the syntax working, but I got as far as this:

domain=ing.com
curl -N -s https://www.virustotal.com/ui/domains/$domain/subdomains\?limit\= | python -c "import sys, json; jf=json.load(sys.stdin); print ''.join(json.dumps(jf['data'][index]['id'], indent=4) for index in range(len(jf)+1))" | grep -o '"[^"]\+"' | grep "$domain" | sed 's/"//g'

This only outputs 3 results. Does anybody know what I am doing wrong and what it should be? I'll push it out together with a big update containing several improvements.

slayerlab commented 5 years ago

@vincentcox: I just saw your question right now.

You need to index the entries where the id is located. I mean: accessing entry 0: jf['data'][0]['id']; accessing entry 1: jf['data'][1]['id']; [...] and so forth.

Note the index in the example above: ['data'][0], ['data'][1], ['data'][2], ... can be iterated with for index in range(len(jf['data'])), because len(jf['data']) is 10 (range starts from 0, so the last value of index is 9).

Putting all together:

$ curl -N -s https://www.virustotal.com/ui/domains/ing.com/subdomains\?limit\= | python -c "import sys, json; jf=json.load(sys.stdin); print ''.join(json.dumps(jf['data'][index]['id'], indent=4) for index in range(len(jf['data'])))" | grep -o '"[^"]\+"' | grep "$domain" | sed 's/"//g'
www.ing.com
think.ing.com
t.ibicontactlab.bulkmail.ing.com
search.ing.com
research.ing.com
accp.api.ing.com
acceptance.ing.com
acceptance-contr.ing.com
api.ing.com
uk.ing.com

Line count (wc -l): 10

EDIT

The reason the result is 3 is that the content of this JSON document consists of 3 "root keys": data, links & meta.

data: […]
links: {…}
meta: {…}
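
A quick way to see the difference with a dummy document (just an illustration, not VirusTotal's real response):

echo '{"data": [{"id": "a.example"}, {"id": "b.example"}], "links": {}, "meta": {}}' | \
python -c "import sys, json; jf=json.load(sys.stdin); print(len(jf)); print(len(jf['data']))"

The first number printed is 3 (the root keys), the second is 2 (the entries you actually want to iterate over), which is why the range() has to be driven by len(jf['data']).
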
vincentcox commented 3 years ago

Hi there, hopefully I'll get to rewrite this in python3 in the future! I hope I find some time for it, because a lot of stuff happened in the last month, so unfortunately I can't give any ETA. Closing this issue; again, thanks for all the input!
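
For reference (untested), the certspotter one-liner above should only need print as a function call to run under python3:

curl -s https://certspotter.com/api/v0/certs?domain=$domain | python3 -c "import sys, json; jf=json.load(sys.stdin); print(''.join(json.dumps(jf[index]['dns_names'], indent=4) for index,value in enumerate(jf)))" | tr -d '"' | tr -d "[" | tr -d "]" | tr -d " " | tr -d "," | sort -u | grep -v "\*" | grep "\.$domain"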

slayerlab commented 3 years ago

Hi, @vincentcox! Great. I hope you get the time you need, and everything is alright.