A virtual host scanner that performs reverse lookups, can be used with pivot tools, detects catch-all scenarios, and works around wildcards, aliases, and dynamic default pages.
I've resolved one bug with this but it doesn't appear to work against our test bench when tested with Windows. Stack trace below:
```
Traceback (most recent call last):
  File "VHostScan.py", line 114, in <module>
    main()
  File "VHostScan.py", line 109, in main
    output.output_json(arguments.output_json)
  File "C:\Users\codingo\VHostScan\lib\helpers\output_helper.py", line 59, in output_json
    file.write_file(json.dumps(list))
  File "C:\Python34\lib\json\__init__.py", line 230, in dumps
    return _default_encoder.encode(obj)
  File "C:\Python34\lib\json\encoder.py", line 192, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Python34\lib\json\encoder.py", line 250, in iterencode
    return _iterencode(o, 0)
  File "C:\Python34\lib\json\encoder.py", line 173, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: b'<!DOCTYPE html>\n<html>\n <head>\n <title>Debian</title>\n </head>\n <body>\n <h1>Debian</h1>\n <p>This copy of Debian is not genuine.</p>\n <p>You may be a victim of software counterfeiting.</p>\n </body>\n</html>\n' is not JSON serializable
```
It looks to me like this is to do with the `host.content` field holding the raw HTML as `bytes`, which `json.dumps` can't serialize, so some further casting (decoding to `str`) will need to be done before it can be written back out as JSON. Given that some wordlists could be exceptionally large, and I don't see many users utilising this data, you could always drop this field from the output to minimize the data being written (writing everything for every scanned result is going to produce a large output very quickly).
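A minimal sketch of both options, assuming a dict-shaped result; the field names here are illustrative and not VHostScan's actual structure:

```python
import json

# Hypothetical scan result: the page body arrives as bytes,
# which json.dumps() rejects with "is not JSON serializable".
result = {
    "hostname": "example.local",
    "response_code": 200,
    "content": b"<!DOCTYPE html>\n<html><body><h1>Debian</h1></body></html>\n",
}

# Option 1: decode any bytes values to str so the field serializes.
serializable = {
    key: (value.decode("utf-8", errors="replace")
          if isinstance(value, bytes) else value)
    for key, value in result.items()
}
print(json.dumps(serializable))

# Option 2: drop the bulky content field entirely to keep the output small.
trimmed = {key: value for key, value in result.items() if key != "content"}
print(json.dumps(trimmed))
```

Either approach clears the `TypeError`; option 2 has the added benefit of keeping large scans' JSON output manageable.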