blacklanternsecurity / bbot

A recursive internet scanner for hackers.
https://www.blacklanternsecurity.com/bbot/
GNU General Public License v3.0

Scan Errors #1159

Closed · amiremami closed this issue 4 months ago

amiremami commented 4 months ago

I scanned with BBOT 1.1.7.3042rc0 and got some errors.

  1. I got stdout blocking on events. Is this normal?

  2. Gowitness won't work if the domain has two A records in Cloudflare. I reported this yesterday; at the time I thought the issue might be on my side, but I now think this is a bug. Please check the attached files: gowitness didn't run at all. In some cases it runs but doesn't screenshot all links. debug.log output.txt

If I remove one of the A records from Cloudflare, gowitness runs properly.

TheTechromancer commented 4 months ago

Thanks for noticing this. The stdout blocking was a debug statement I forgot to remove; it's fixed in https://github.com/blacklanternsecurity/bbot/pull/1160.

We'll be looking into gowitness.

amiremami commented 4 months ago

Documenting this here as well: prevent generating buckets which do not exist.


TheTechromancer commented 4 months ago

After looking back at my old code I was reminded how Azure handles their buckets, and I was able to verify that those buckets actually do exist. You can see the difference with curl: an existing bucket's hostname will resolve, while a non-existent one's won't.


Keep in mind that only means they exist, not that they're open. Azure's buckets don't support listing files, but they are dirbustable via ffuf, etc.
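
For illustration, here is a minimal sketch of that DNS-based existence check in Python; the storage account names below are placeholders, not the ones from the scan:

import socket

# Placeholder storage account names, for illustration only.
candidates = ["realcompany", "realcompanydoesnotexist123"]

for name in candidates:
    host = f"{name}.blob.core.windows.net"  # Azure blob storage hostname format
    try:
        ip = socket.gethostbyname(host)
        print(f"{host} resolves to {ip} -> bucket exists (but may not be open)")
    except socket.gaierror:
        print(f"{host} does not resolve -> bucket does not exist")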

As for the gowitness bug, can you share the output of both scans (with and without the second A record) so I can compare the results? Right now I'm unsure whether the bug is with httpx or gowitness, because the site must go through httpx before it gets to gowitness.

amiremami commented 4 months ago

For example, as you can see, the screenshot of https://bugz.zip/secret.html is missing from the "two A records" scan.


debug-twoA.log output-twoA.txt debug-oneA.log output-oneA.txt

TheTechromancer commented 4 months ago

URLs detected with one record:
https://bugz.zip/
http://bugz.zip/
https://bugz.zip/secret.html
https://bugz.zip:2087/
https://bugz.zip:8443/
http://bugz.zip:2086/
https://bugz.zip:2083/
http://bugz.zip:8880/
http://bugz.zip:2082/
https://bugz.zip:2053/
https://www.bugz.zip/
https://www.bugz.zip:8443/
https://www.bugz.zip:2087/
http://www.bugz.zip/
http://www.bugz.zip:8080/
http://www.bugz.zip:2052/
http://www.bugz.zip:2095/
http://www.bugz.zip:8880/
http://www.bugz.zip:2086/
http://www.bugz.zip:2082/
https://www.bugz.zip:2083/

URLs detected with two records:
http://bugz.zip:443/
http://bugz.zip:2053/
http://bugz.zip:8443/
http://bugz.zip:2083/
http://bugz.zip:2087/
http://www.bugz.zip:443/
http://www.bugz.zip:2087/
http://www.bugz.zip:8443/
http://www.bugz.zip:2083/

This indicates either a bug in httpx, or a problem with one of the IPs.

@amiremami are you sure both IP addresses are valid and return the same thing? I.e. if you were to curl them both like this:

curl -H 'Host: bugz.zip' https://1.2.3.4/
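
A rough Python equivalent of that check, using the httpx client library (not the probing tool of the same name), with placeholder IPs standing in for the two A records:

import httpx

# Placeholder IPs; substitute the two A records from Cloudflare.
ips = ["1.2.3.4", "5.6.7.8"]

for ip in ips:
    try:
        # Send the request to the raw IP but keep the original Host header,
        # mirroring: curl -H 'Host: bugz.zip' https://1.2.3.4/
        r = httpx.get(f"https://{ip}/", headers={"Host": "bugz.zip"}, verify=False, timeout=10)
        print(f"{ip}: HTTP {r.status_code}, {len(r.content)} bytes")
    except httpx.HTTPError as e:
        print(f"{ip}: request failed ({e})")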

amiremami commented 4 months ago

I have two servers (small and big) sharing one storage, so I switch the storage between them and only one of them is powered on at a time. In both tests I was working with my small server, and bugz.zip pointed to its IP. In the two-A-records test, I also added to Cloudflare the IP of the big server, which was powered off.

TheTechromancer commented 4 months ago

Wait, so you're saying one of the IPs was offline?

amiremami commented 4 months ago

Wait, so you're saying one of the IPs was offline?

Yes, correct. But I don't understand how this impacts gowitness. ☹️ So if a site adds an extra offline IP, gowitness can't run correctly?

TheTechromancer commented 4 months ago

The issue is with httpx, since all URL events have to go through httpx before getting to gowitness. Httpx will only pick one of the IPs to visit, and in this case it happens to be picking the inactive one.

This is how DNS resolution is typically handled, even by security tools: many hosts have multiple A records, but 99.9% of the time those records all serve identical content, so visiting every one of them is considered a waste of resources.
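
As an illustration, here is how a host with multiple A records looks when resolved (example.com is just a stand-in domain):

import socket

# Stand-in domain; substitute any host with multiple A records.
host = "example.com"

# Collect the unique IPv4 addresses returned for the host.
infos = socket.getaddrinfo(host, 443, family=socket.AF_INET, type=socket.SOCK_STREAM)
ips = sorted({info[4][0] for info in infos})

print(f"{host} has {len(ips)} A record(s): {', '.join(ips)}")
# A typical HTTP client connects to just one of these addresses,
# which is why an offline IP in the set can make probing unreliable.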


Httpx does have an option to visit all IPs, but it's disabled by default. I'll add an option for it: https://github.com/blacklanternsecurity/bbot/pull/1165.

amiremami commented 4 months ago

Is this ok? @TheTechromancer

2024-03-13 10:08:58,423 [TRACE] bbot.modules.asn base.py:1366 Traceback (most recent call last):
  File "/root/.local/pipx/venvs/bbot/lib/python3.10/site-packages/bbot/modules/report/asn.py", line 217, in get_url
    j = r.json()
  File "/root/.local/pipx/venvs/bbot/lib/python3.10/site-packages/httpx/_models.py", line 762, in json
    return jsonlib.loads(self.content, **kwargs)
  File "/usr/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.10/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

TheTechromancer commented 4 months ago

Is this ok?

Yes, freely available ASN services are very unreliable, but the module will automatically fail over to another service.
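
That JSONDecodeError is exactly what the fail-over covers: if one service returns an empty or non-JSON body, the lookup just moves on to the next one. A minimal sketch of the pattern (the service URLs and function name are illustrative, not BBOT's actual code):

import json
import httpx

# Illustrative endpoints only, not the exact services BBOT queries.
ASN_SERVICES = [
    "https://asn-service-one.example/api/1.2.3.4",
    "https://asn-service-two.example/api/1.2.3.4",
]

def lookup_asn(services=ASN_SERVICES):
    for url in services:
        try:
            r = httpx.get(url, timeout=10)
            return r.json()  # raises json.JSONDecodeError on an empty or non-JSON body
        except (httpx.HTTPError, json.JSONDecodeError):
            # This service is flaky or returned garbage; try the next one.
            continue
    return None  # every service failed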

TheTechromancer commented 4 months ago

@amiremami are we good to close this now?

amiremami commented 4 months ago

@amiremami are we good to close this now?

Yes, thank you so much 🙏