Closed. Melaru closed this issue 1 year ago.
I'm getting this also.
Hello, I updated the source code and now it seems to work fine. Thanks for opening the issue.
still doesn't work
Hello, May I know what command you are running?
I don't even know brez, you got discord? Kaden <3#3957 add me and help me brez
Why can't you run it? Just open CMD or a terminal and run the commands as mentioned in the README.
ok
Exception in thread Thread-2:
Traceback (most recent call last):
File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
self.run()
File "C:\python3\lib\threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 57, in scrapeproxies
result = proxyscrape(table = soup.find('table', attrs={'id': 'proxylisttable'}))
File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 42, in proxyscrape
for row in table.findAll('tr'):
AttributeError: 'NoneType' object has no attribute 'findAll'
Exception in thread Thread-4:
Traceback (most recent call last):
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
httplib_response = self._make_request(
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 376, in _make_request
self._validate_conn(conn)
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 994, in _validate_conn
conn.connect()
File "C:\python3\lib\site-packages\urllib3\connection.py", line 386, in connect
self.sock = ssl_wrap_socket(
File "C:\python3\lib\site-packages\urllib3\util\ssl_.py", line 370, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\python3\lib\ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "C:\python3\lib\ssl.py", line 1040, in _create
self.do_handshake()
File "C:\python3\lib\ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1129)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\python3\lib\site-packages\requests\adapters.py", line 439, in send
resp = conn.urlopen(
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 719, in urlopen
retries = retries.increment(
File "C:\python3\lib\site-packages\urllib3\util\retry.py", line 436, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.proxy-list.download', port=443): Max retries exceeded with url: /api/v1/get?type=http&anon=elite (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
self.run()
File "C:\python3\lib\threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 23, in proxyListDownloadScraper
html = session.get(url).text
File "C:\python3\lib\site-packages\requests\sessions.py", line 546, in get
return self.request('GET', url, **kwargs)
File "C:\python3\lib\site-packages\requests\sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "C:\python3\lib\site-packages\requests\sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "C:\python3\lib\site-packages\requests\adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.proxy-list.download', port=443): Max retries exceeded with url: /api/v1/get?type=http&anon=elite (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))
Exception in thread Thread-1:
Traceback (most recent call last):
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
httplib_response = self._make_request(
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 421, in _make_request
six.raise_from(e, None)
File "
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\python3\lib\site-packages\requests\adapters.py", line 439, in send
resp = conn.urlopen(
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 719, in urlopen
retries = retries.increment(
File "C:\python3\lib\site-packages\urllib3\util\retry.py", line 400, in increment
raise six.reraise(type(error), error, _stacktrace)
File "C:\python3\lib\site-packages\urllib3\packages\six.py", line 734, in reraise
raise value.with_traceback(tb)
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
httplib_response = self._make_request(
File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 421, in _make_request
six.raise_from(e, None)
File "
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
self.run()
File "C:\python3\lib\threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 56, in scrapeproxies
soup=makesoup(url)
File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 35, in makesoup
page=requests.get(url)
File "C:\python3\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "C:\python3\lib\site-packages\requests\api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "C:\python3\lib\site-packages\requests\sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "C:\python3\lib\site-packages\requests\sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "C:\python3\lib\site-packages\requests\adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
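Note on the SSL and connection errors above: one way to keep a dead source (for example www.proxy-list.download, which is refusing the TLS handshake here) from killing a whole scraper thread is to wrap the per-URL fetch in a try/except around the requests call. This is a minimal sketch, not code from proxyScraper.py; fetch_proxy_source and the main-block usage are illustrative names, and only the URL comes from the traceback.

import requests

def fetch_proxy_source(session, url, timeout=10):
    # Hypothetical helper: fetch one proxy source and return its body,
    # or an empty string if the source is unreachable.
    try:
        response = session.get(url, timeout=timeout)
        response.raise_for_status()
        return response.text
    except requests.exceptions.RequestException as exc:
        # RequestException covers SSLError, ConnectionError and timeouts,
        # including the "EOF occurred in violation of protocol" and
        # "Remote end closed connection without response" failures above.
        print(f"Skipping {url}: {exc}")
        return ""

if __name__ == "__main__":
    session = requests.Session()
    body = fetch_proxy_source(
        session, "https://www.proxy-list.download/api/v1/get?type=http&anon=elite"
    )
    print(f"{len(body)} bytes fetched")

With this pattern a failing source just produces an empty result for that thread instead of an unhandled exception.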
Object has no attribute findAll
AttributeError: 'NoneType' object has no attribute 'findAll'
Same question here.
Did you try the new version of the script? I updated it.
Object has no attribute findAll
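For reference, the AttributeError means soup.find('table', attrs={'id': 'proxylisttable'}) returned None, i.e. the page being scraped no longer contains a table with that id, so calling findAll on the result fails. A minimal sketch of a guard, assuming requests and BeautifulSoup as in the traceback; scrape_table_proxies is an illustrative name, not the function in proxyScraper.py.

import requests
from bs4 import BeautifulSoup

def scrape_table_proxies(url):
    # Illustrative helper: return "ip:port" strings from a proxy-list table,
    # or an empty list if the expected table is missing.
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    table = soup.find("table", attrs={"id": "proxylisttable"})
    if table is None:
        # The site layout changed (or the page did not render the list);
        # bail out instead of calling findAll on None.
        print(f"No table with id 'proxylisttable' found at {url}")
        return []
    proxies = []
    for row in table.find_all("tr"):
        cells = row.find_all("td")
        if len(cells) >= 2:
            proxies.append(f"{cells[0].text.strip()}:{cells[1].text.strip()}")
    return proxies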