s0md3v / Photon

Incredibly fast crawler designed for OSINT.
GNU General Public License v3.0

Don't know why it's not working #111

Closed fr33s0ul closed 5 years ago

fr33s0ul commented 5 years ago
```
    ____  __          __
   / __ \/ /_  ____  / /_____  ____
  / /_/ / __ \/ __ \/ __/ __ \/ __ \
 / ____/ / / / /_/ / /_/ /_/ / / / /
/_/   /_/ /_/\____/\__/\____/_/ /_/  v1.2.1
```

```
Traceback (most recent call last):
  File "photon.py", line 266, in <module>
    zap(main_url, args.archive, domain, host, internal, robots)
  File "/pentest/Photon/core/zap.py", line 23, in zap
    response = requests.get(inputUrl + '/robots.txt').text
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 72, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 508, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 618, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 508, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='www.xxxxxxxxxxxxxx.com', port=443): Max retries exceeded with url: /robots.txt (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f343a97aad0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
```
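
The final exception is a DNS failure (`[Errno -2] Name or service not known`): the hostname never resolved, so no connection to port 443 was ever attempted. A minimal sketch of how the `robots.txt` fetch in `core/zap.py` could tolerate this (the `zap`/`inputUrl` names are taken from the traceback above; the empty-string fallback is an assumption, not Photon's actual handling):

```python
import requests


def fetch_robots(inputUrl):
    """Fetch robots.txt, tolerating unresolved or unreachable hosts."""
    try:
        # The traceback shows the equivalent unguarded call in core/zap.py.
        return requests.get(inputUrl + '/robots.txt', timeout=10).text
    except requests.exceptions.ConnectionError:
        # "[Errno -2] Name or service not known" means the DNS lookup
        # failed, so there is nothing to crawl here; fall back to empty.
        return ''
```

Checking that the target actually resolves (e.g. with `ping` or `nslookup`) would also rule out a typo in the URL.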

dataexfiltration commented 5 years ago

Use `python3 photon.py` and it should work; the `/usr/local/lib/python2.7/` paths in your traceback show Photon was run under Python 2.
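
For reference, a typical Python 3 invocation looks like the following (the target URL is a placeholder, and `-u`/`--url` is Photon's documented flag for the root URL):

```
python3 photon.py -u https://example.com
```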