13o-bbr-bbq / machine_learning_security

Source code about machine learning and security.

Crawling a web port fails #16

Closed cybert79 closed 5 years ago

cybert79 commented 6 years ago

2018-07-17 22:05:12 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: scrapybot)
2018-07-17 22:05:12 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.5.3 (default, Jan 19 2017, 14:11:04) - [GCC 6.3.0 20170118], pyOpenSSL 18.0.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.9.0-3-amd64-x86_64-with-debian-9.5
2018-07-17 22:05:12 [scrapy.crawler] INFO: Overridden settings: {'FEED_FORMAT': 'json', 'SPIDER_LOADER_WARN_ONLY': True, 'FEED_URI': 'crawl_result/20180717220511_crawl_result.json'}
2018-07-17 22:05:12 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.feedexport.FeedExporter', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.telnet.TelnetConsole']
[*] Save log to /opt/machine_learning_security/DeepExploit/crawl_result/some-ip1_80.log
2018-07-17 22:05:13 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-07-17 22:05:13 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-07-17 22:05:13 [scrapy.middleware] INFO: Enabled item pipelines: []
2018-07-17 22:05:13 [scrapy.core.engine] INFO: Spider opened
2018-07-17 22:05:13 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-17 22:05:13 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-17 22:05:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://some-ip:80/> (referer: None)
2018-07-17 22:05:13 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-17 22:05:13 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 212, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 564, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2018, 7, 17, 20, 5, 13, 150929), 'log_count/DEBUG': 2, 'log_count/INFO': 7, 'memusage/max': 55209984, 'memusage/startup': 55209984, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'start_time': datetime.datetime(2018, 7, 17, 20, 5, 13, 10283)}
2018-07-17 22:05:13 [scrapy.core.engine] INFO: Spider closed (finished)

Traceback (most recent call last):
  File "DeepExploit.py", line 2064, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 525, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 159, in run_spider
    dict_json = json.load(fin)
  File "/usr/lib/python3.5/json/__init__.py", line 268, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
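For context: "Expecting value: line 1 column 1 (char 0)" is exactly what Python's json module raises when the input contains no JSON value at all, for example when the feed file is empty. A minimal reproduction, independent of DeepExploit:

    import json

    try:
        # An empty string is what the parser sees when the crawl
        # result file exists but nothing was ever written to it.
        json.loads("")
    except json.JSONDecodeError as err:
        print(err)  # Expecting value: line 1 column 1 (char 0)

This points at the crawl result file being empty or malformed, rather than at the parser itself.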

cybert79 commented 6 years ago

Here is some more info, using an IP I can publish:

[+] Execute Nmap against 81.82.222.25
[*] nmap -Pn -sT -A -r --initial-rtt-timeout 300ms --min-rtt-timeout 200ms --max-rtt-timeout 1000ms --max-scan-delay 200ms --max-retries 3 -oX nmap_result_81.82.222.25.xml 81.82.222.25

[*] Start time: 2018/07/17 22:07:31
[*] Port scanning: 81.82.222.25 [Elapsed time: 0 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 5 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 10 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 15 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 20 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 25 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 30 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 35 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 40 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 45 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 50 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 55 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 60 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 65 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 70 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 75 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 80 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 85 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 90 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 95 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 100 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 105 s]
[*] End time : 2018/07/17 22:09:23
[+] Get port list from nmap_result_81.82.222.25.xml.
[+] Get exploit list.
[*] Loading exploit list from local file: /opt/machine_learning_security/DeepExploit/data/exploit_list.csv
[+] Get payload list.
[*] Loading payload list from local file: /opt/machine_learning_security/DeepExploit/data/payload_list.csv
[+] Get exploit tree.
[*] Loading exploit tree from local file: /opt/machine_learning_security/DeepExploit/data/exploit_tree.json
[+] Get target info.
[+] Check web port.
[*] Target URL: http://81.82.222.25:81
[*] Port "81" is web port. status=200
[*] Target URL: http://81.82.222.25:1723
[!] Port "1723" is not web port.
[*] Target URL: https://81.82.222.25:1723
[!] Port "1723" is not web port.

2018-07-17 22:09:29 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: scrapybot)
2018-07-17 22:09:29 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.5.3 (default, Jan 19 2017, 14:11:04) - [GCC 6.3.0 20170118], pyOpenSSL 18.0.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.9.0-3-amd64-x86_64-with-debian-9.5
2018-07-17 22:09:29 [scrapy.crawler] INFO: Overridden settings: {'SPIDER_LOADER_WARN_ONLY': True, 'FEED_URI': 'crawl_result/20180717220928_crawl_result.json', 'FEED_FORMAT': 'json'}
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.feedexport.FeedExporter', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.memusage.MemoryUsage']
[*] Save log to /opt/machine_learning_security/DeepExploit/crawl_result/81.82.222.25_81.log
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled item pipelines: []
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Spider opened
2018-07-17 22:09:29 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-17 22:09:29 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-17 22:09:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-17 22:09:29 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-17 22:09:29 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 490, 'downloader/request_count': 2, 'downloader/request_method_count/GET': 2, 'downloader/response_bytes': 4589, 'downloader/response_count': 2, 'downloader/response_status_count/200': 1, 'downloader/response_status_count/302': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2018, 7, 17, 20, 9, 29, 508799), 'log_count/DEBUG': 3, 'log_count/INFO': 7, 'memusage/max': 55042048, 'memusage/startup': 55042048, 'response_received_count': 1, 'scheduler/dequeued': 2, 'scheduler/dequeued/memory': 2, 'scheduler/enqueued': 2, 'scheduler/enqueued/memory': 2, 'start_time': datetime.datetime(2018, 7, 17, 20, 9, 29, 201870)}
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Spider closed (finished)

Traceback (most recent call last):
  File "DeepExploit.py", line 2064, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 525, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 159, in run_spider
    dict_json = json.load(fin)
  File "/usr/lib/python3.5/json/__init__.py", line 268, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

13o-bbr-bbq commented 6 years ago

I've addressed the null-character problem when loading the file. Please try again.
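A later traceback in this thread shows what the fix looks like in util.py: the feed file is read as text and NUL bytes are stripped before parsing. A sketch of that approach (the actual util.py code is not reproduced here and may differ):

    import json

    def load_crawl_result(path):
        # Drop NUL characters, which are never valid inside a JSON
        # document, before handing the text to the parser.
        with open(path, 'r') as fin:
            return json.loads(fin.read().replace('\0', ''))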

cybert79 commented 6 years ago

OK, so I should do a git pull first?

cybert79 commented 6 years ago

Did you add a function? :)

[!] 1638/1792 unix/webapp/squirrelmail_pgp_plugin module is danger (rank: manual). Can't load.

cybert79 commented 6 years ago

It went a little better now, but there are still errors:

[*] 1174/1174 exploit:linux/smtp/haraka, targets:2
[*] Saved exploit tree.
[+] Get target info.
[+] Check web port.
[*] Target URL: http://81.82.222.25:81
[*] Port "81" is web port. status=200
[*] Target URL: http://81.82.222.25:1723
[!] Port "1723" is not web port.
[*] Target URL: https://81.82.222.25:1723
[!] Port "1723" is not web port.

2018-07-18 09:36:14 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: scrapybot)
2018-07-18 09:36:14 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.5.3 (default, Jan 19 2017, 14:11:04) - [GCC 6.3.0 20170118], pyOpenSSL 18.0.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.9.0-3-amd64-x86_64-with-debian-9.5
2018-07-18 09:36:14 [scrapy.crawler] INFO: Overridden settings: {'SPIDER_LOADER_WARN_ONLY': True, 'FEED_URI': 'crawl_result/20180718093613_crawl_result.json', 'FEED_FORMAT': 'json'}
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.feedexport.FeedExporter', 'scrapy.extensions.telnet.TelnetConsole']
[*] Save log to /opt/machine_learning_security/DeepExploit/crawl_result/81.82.222.25_81.log
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled item pipelines: []
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Spider opened
2018-07-18 09:36:14 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-18 09:36:14 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-18 09:36:14 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-18 09:36:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-18 09:36:14 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 490, 'downloader/request_count': 2, 'downloader/request_method_count/GET': 2, 'downloader/response_bytes': 4589, 'downloader/response_count': 2, 'downloader/response_status_count/200': 1, 'downloader/response_status_count/302': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2018, 7, 18, 7, 36, 14, 923712), 'log_count/DEBUG': 3, 'log_count/INFO': 7, 'memusage/max': 54947840, 'memusage/startup': 54947840, 'response_received_count': 1, 'scheduler/dequeued': 2, 'scheduler/dequeued/memory': 2, 'scheduler/enqueued': 2, 'scheduler/enqueued/memory': 2, 'start_time': datetime.datetime(2018, 7, 18, 7, 36, 14, 607720)}
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Spider closed (finished)

Traceback (most recent call last):
  File "DeepExploit.py", line 2082, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 525, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 159, in run_spider
    dict_json = json.loads(fin.read().replace('\0', ''))
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

13o-bbr-bbq commented 6 years ago

Hi

Did you add a function? :)

[!] 1638/1792 unix/webapp/squirrelmail_pgp_plugin module is danger (rank: manual). Can't load.

Yes, I did.
That's not a problem: DeepExploit only uses safe modules, i.e. rank = excellent, great, or good.
In the future, I'll allow users to choose the rank.
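A hypothetical sketch of such a rank filter (the names and message format are illustrative, not DeepExploit's actual code; ranks follow Metasploit's convention):

    # Only load modules whose Metasploit rank is considered safe
    # to run automatically.
    SAFE_RANKS = {'excellent', 'great', 'good'}

    def is_loadable(idx, total, module_name, rank):
        if rank not in SAFE_RANKS:
            print("[!] {}/{} {} module is danger (rank: {}). Can't load."
                  .format(idx, total, module_name, rank))
            return False
        return True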

It went a little better now, but there are still errors:

Ugh...
This problem is solved in my environment.
I'm still looking for the cause of the problem; please give me a moment.

cybert79 commented 6 years ago

Well, I just installed it on Debian 9 with Python 3. Do you need more info about my system?

13o-bbr-bbq commented 6 years ago

I don't need more info right now.
I released an updated util.py module. It removes control characters such as NUL, EOT, and DEL.

If your environment still produces the error, please send me your yyyymmddhhmmss_crawl_result.json.
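A sketch of what such a control-character filter could look like (the real delete_ctrl_char() in util.py is only named in the traceback below, so its shape here is an assumption):

    import json
    import re

    # Match ASCII control characters (NUL, EOT, DEL, ...) while
    # keeping tab, newline, and carriage return, which are legal
    # JSON whitespace.
    CTRL_CHARS = re.compile(r'[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]')

    def delete_ctrl_char(text):
        return CTRL_CHARS.sub('', text)

    def load_crawl_result(path):
        with open(path, 'r') as fin:
            return json.loads(delete_ctrl_char(fin.read()))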

cybert79 commented 6 years ago

Mmmm, still the same issue, and the JSON file is empty...

2018-07-19 10:33:09 [scrapy.core.engine] INFO: Spider opened
2018-07-19 10:33:09 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-19 10:33:09 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-19 10:33:09 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-19 10:33:09 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-19 10:33:09 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-19 10:33:09 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 490, 'downloader/request_count': 2, 'downloader/request_method_count/GET': 2, 'downloader/response_bytes': 4589, 'downloader/response_count': 2, 'downloader/response_status_count/200': 1, 'downloader/response_status_count/302': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2018, 7, 19, 8, 33, 9, 576632), 'log_count/DEBUG': 3, 'log_count/INFO': 7, 'memusage/max': 55201792, 'memusage/startup': 55201792, 'response_received_count': 1, 'scheduler/dequeued': 2, 'scheduler/dequeued/memory': 2, 'scheduler/enqueued': 2, 'scheduler/enqueued/memory': 2, 'start_time': datetime.datetime(2018, 7, 19, 8, 33, 9, 248249)}
2018-07-19 10:33:09 [scrapy.core.engine] INFO: Spider closed (finished)

Traceback (most recent call last):
  File "DeepExploit.py", line 2094, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 539, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 169, in run_spider
    dict_json = json.loads(self.delete_ctrl_char(fin.read()))
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

root@debian:/opt/machine_learning_security/DeepExploit# ls
config.ini  crawl_result  CreateReport.py  data  DeepExploit.py  deep_plugin  img  LICENSE  __pycache__  README.md  report  requirements.txt  Spider.py  trained_data  util.py
root@debian:/opt/machine_learning_security/DeepExploit# cd data/
root@debian:/opt/machine_learning_security/DeepExploit/data# ls
exploit_list.csv  exploit_tree.json  payload_list.csv
root@debian:/opt/machine_learning_security/DeepExploit/data# cd ..
root@debian:/opt/machine_learning_security/DeepExploit# cd trained_data/
root@debian:/opt/machine_learning_security/DeepExploit/trained_data# ls
checkpoint  DeepExploit.ckpt.data-00000-of-00001  DeepExploit.ckpt.index  DeepExploit.ckpt.meta

root@debian:/opt/machine_learning_security/DeepExploit/trained_data# cd ..
root@debian:/opt/machine_learning_security/DeepExploit# cd crawl_result/
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# ls
20180719103307_crawl_result.json  81.82.222.25_81.log
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# cat 20180719103307_crawl_result.json
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result#
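At this point no character filtering can help: the feed file is zero bytes, so the parser still finds no JSON value. The crawl above was redirected to a login page and scraped zero items, which plausibly explains the empty feed. A guarded loader could treat this case as "no results" instead of crashing (a sketch, not the project's actual code):

    import json
    import os

    def load_crawl_result(path):
        # An empty feed file means the spider scraped zero items
        # (here the target served only a login page); return an
        # empty result instead of letting json.loads() raise.
        if not os.path.exists(path) or os.path.getsize(path) == 0:
            return []
        with open(path, 'r') as fin:
            text = fin.read().replace('\0', '')
            return json.loads(text) if text.strip() else []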

cybert79 commented 6 years ago

Here is the content of the other file in the crawl directory, the .log file:

root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# cat 81.82.222.25_81.log
<!DOCTYPE html>

Blue Iris Login