initstring / cloud_enum

Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.
MIT License

Breaks when too large a fuzz list is given #36

Closed MR-pentestGuy closed 2 years ago

MR-pentestGuy commented 3 years ago

First of all, very nice tool. I used the commonspeak subdomain wordlist to test it, but it kind of breaks when tested:

Mutations: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt
Brute-list: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt

[+] Mutations list imported: 484943 items
[+] Mutated results: 2909659 items

++++++++++++++++++++++++++ amazon checks ++++++++++++++++++++++++++

[+] Checking for S3 buckets

```
Traceback (most recent call last):
  File "/usr/lib/python3.8/encodings/idna.py", line 165, in encode
    raise UnicodeError("label empty or too long")
UnicodeError: label empty or too long

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "cloud_enum.py", line 243, in <module>
    main()
  File "cloud_enum.py", line 228, in main
    aws_checks.run_all(names, args)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 130, in run_all
    check_s3_buckets(names, args.threads)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 84, in check_s3_buckets
    utils.get_url_batch(candidates, use_ssl=False,
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/utils.py", line 81, in get_url_batch
    batch_results[url] = batch_pending[url].result(timeout=30)
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
    raise self._exception
  File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 387, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 184, in connect
    conn = self._new_conn()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 156, in _new_conn
    conn = connection.create_connection(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/util/connection.py", line 61, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
```
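Editor's note on the error above: the `UnicodeError("label empty or too long")` comes from Python's `idna` codec, which rejects any DNS label (the text between dots in a hostname) that is empty or longer than 63 characters. The names below are illustrative, not taken from the wordlist in question, but a short sketch reproduces the same exception:

```python
# Illustrative only: the idna codec enforces the DNS rule that every
# label in a hostname must be 1-63 characters long. Mutated bucket
# names that violate this rule raise the exact UnicodeError seen above.
candidates = [
    "normal-bucket",   # fine: every label is 1-63 chars
    "a" * 64,          # breaks: one label exceeds 63 chars
    "double..dot",     # breaks: empty label between the two dots
]

for name in candidates:
    host = f"{name}.s3.amazonaws.com"
    try:
        host.encode("idna")
        print(f"ok:   {host[:40]}")
    except UnicodeError as err:
        print(f"fail: {host[:40]} ({err})")
```

This would explain why the error depends on which words are in the list rather than on the raw list length.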

initstring commented 3 years ago

Hi @MR-pentestGuy ,

Thanks for opening an issue! I'm not sure that this has to do with the length of the wordlist, but maybe.

Thanks!

MR-pentestGuy commented 3 years ago
  1. Yes, the tool works very well with the stock fuzz list.
  2. I am using WSL Ubuntu 20.04.
  3. The command I am using is the same: cloud_enum.py -k keyword -t 10
  4. No, I have added the commonspeak list to fuzz.txt.
initstring commented 3 years ago

Thanks for that info!

Can you please provide me a link to the exact word list (commonspeak) you are talking about? I'm not familiar with that, and I want to make sure to test with the exact same file.

MR-pentestGuy commented 3 years ago

https://github.com/assetnote/commonspeak2-wordlists (go to the subdomains folder)

MR-pentestGuy commented 3 years ago

One more thing: I have reduced the wordlist to 448 entries (before it was around 480081) by taking only specific words in fuzz.txt.

initstring commented 3 years ago

Thanks! It's taking a bit too long to test with the full list. :) Does your shorter list also produce the error? If so, can you share the shorter list with me?

sri-sanath commented 3 years ago

The shorter list of around 200 words I took from the commonspeak2 wordlist. By the way, didn't it break while you were running it? Maybe you have more GiBs of RAM, I guess.

initstring commented 3 years ago

@sri-sanath Sorry - are you also the original reporter of this issue answering from a different account? I'm not exactly sure what you mean.

I don't believe the original issue has anything to do with the length of the wordlist. I think there are some particular strings in the list that are breaking it, so I need the smallest possible list that causes the error to reproduce.
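Editor's note: if the hypothesis above is right and specific strings are the culprit, one way to isolate them is to pre-filter the wordlist for words that cannot form a valid DNS hostname. This is a standalone sketch, not part of cloud_enum; `is_valid_label` and the S3-style suffix are assumptions for illustration:

```python
# Hypothetical pre-filter: keep only words that survive idna encoding
# when embedded in an S3-style hostname. The suffix is illustrative.
def is_valid_label(word: str, suffix: str = ".s3.amazonaws.com") -> bool:
    """Return True if word + suffix encodes as a legal IDNA hostname."""
    try:
        (word + suffix).encode("idna")
    except UnicodeError:
        # Empty label, label over 63 chars, or other encoding failure.
        return False
    return True

def filter_wordlist(words):
    """Drop every word that would trigger the UnicodeError."""
    return [w for w in words if is_valid_label(w)]

# Example: only "good-word" survives.
print(filter_wordlist(["good-word", "", "a" * 64, "double..dot"]))
```

The words the filter rejects are exactly the candidates worth attaching to this issue as a minimal reproducer.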

Thanks.

sri-sanath commented 3 years ago

Yes, I am using a different account.

sri-sanath commented 3 years ago

You could take a list of around 1000 words, as low as possible. If you use more than 1000 it breaks; I have tested it.

initstring commented 3 years ago

OK, please provide an exact copy of a list that breaks it, with the smallest amount of words possible. You can upload the list as an attachment to your comment. Thanks!

MR-pentestGuy commented 3 years ago

Sorry about using two accounts. Here is a link to those lists; take the first 1000 or 1100: https://github.com/koaj/aws-s3-bucket-wordlist

MR-pentestGuy commented 3 years ago

And one more thing: while using the keyword (-kf) list, if your keyword list is larger than the wordlist, this also breaks. I have been testing a top-level company and took keywords by enumerating JavaScript, so those are around 3K.

initstring commented 3 years ago

I cannot reproduce this issue. I have a hunch it's something to do with WSL, but I could definitely be wrong. Unfortunately, I don't have an environment with Windows to try to reproduce this there. I am able to use quite large wordlists with no problem.

In terms of the issue you reported, I tried running the tool with -kf and provided a keyword file that had more lines than the fuzz.txt file. Again, I encountered no error and the tool worked fine.

Sorry, I wish I had a better answer for you but I'm a bit stumped. If you figure it out, please let me know so I can work on a patch. Or if you have any more info to share that might provide more insight let me know.
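Editor's note: one possible defensive patch direction, should the cause ever be pinned down, is to treat a `UnicodeError` from an individual URL as a miss instead of letting it kill the whole batch. The sketch below is not cloud_enum's actual code: the traceback shows the tool uses `requests` inside a thread pool, while this self-contained illustration uses only the standard library, and `fetch_status` is a made-up name:

```python
import urllib.request
import urllib.error

def fetch_status(url: str, timeout: int = 30):
    """Illustrative worker: return an HTTP status, or None when the
    hostname is invalid or unreachable, rather than raising."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except UnicodeError:
        # Mutated name produced an illegal DNS label; treat as a miss.
        return None
    except urllib.error.URLError:
        # DNS failure, refused connection, HTTP error, etc.
        # Real enumeration code would inspect the status here instead
        # of discarding it, since 404 vs 403 is meaningful for buckets.
        return None

# A 64-char label fails idna encoding before any network I/O,
# so this returns None instantly instead of crashing the batch.
print(fetch_status("http://" + "a" * 64 + ".invalid/"))
```

Wrapping the per-URL worker this way would make one bad mutation cost a single skipped candidate instead of the whole run.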

MR-pentestGuy commented 3 years ago

Hey, don't be sorry. I think it's WSL; it sucks sometimes. I will try to get more info on the tool. Keep up the good work, and thank you for the tool @initstring

initstring commented 2 years ago

It's been a while on this issue with no one else chiming in with a solution. I hope you've found a manageable work-around. But feel free to reply if you have any more questions. Thanks!