iw4p / proxy-scraper

scrape proxies from more than 5 different sources and check which ones are still alive
MIT License

FindAll #6

Closed · Melaru closed this 1 year ago

Melaru commented 3 years ago

Object has no attribute findAll
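
For context, this error almost always means that soup.find('table', ...) returned None because the expected table was not in the downloaded page (the source site changed its markup or served an error page), and calling .findAll() on None raises exactly this AttributeError. Below is a minimal sketch of the failure and a defensive guard; the URL and variable names are illustrative, not the project's actual code:

```python
import requests
from bs4 import BeautifulSoup

# Illustrative source; any page that may or may not contain the expected table.
url = "https://free-proxy-list.net/"

soup = BeautifulSoup(requests.get(url).text, "html.parser")

# soup.find() returns None when no matching tag exists; calling .findAll()
# on that None is what raises:
#   AttributeError: 'NoneType' object has no attribute 'findAll'
table = soup.find("table", attrs={"id": "proxylisttable"})

proxies = []
if table is not None:
    for row in table.findAll("tr"):
        cells = row.findAll("td")
        if len(cells) >= 2:
            proxies.append(f"{cells[0].text}:{cells[1].text}")
else:
    print("Expected table not found; the source layout may have changed.")

print(f"{len(proxies)} proxies parsed")
```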

Samael-urmommy commented 3 years ago

I'm getting this also.

iw4p commented 3 years ago

Hello, I updated the source code and now it seems to work fine. Thanks for opening the issue.

kadenqr commented 3 years ago

> Hello, I updated the source code and now it seems to work fine. Thanks for opening the issue.

Still doesn't work.

iw4p commented 3 years ago

> Still doesn't work.

Hello, may I know what command you are running?

kadenqr commented 3 years ago

> Hello, may I know what command you are running?

I don't even know brez, you got Discord? Kaden <3#3957, add me and help me brez.

iw4p commented 3 years ago

> I don't even know brez, you got Discord? Kaden <3#3957, add me and help me brez.

Why can't you run it? Just open CMD or Terminal and run the commands as mentioned in the README.

kadenqr commented 3 years ago

ok

xy0893 commented 3 years ago

Exception in thread Thread-2:
Traceback (most recent call last):
  File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
    self.run()
  File "C:\python3\lib\threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 57, in scrapeproxies
    result = proxyscrape(table = soup.find('table', attrs={'id': 'proxylisttable'}))
  File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 42, in proxyscrape
    for row in table.findAll('tr'):
AttributeError: 'NoneType' object has no attribute 'findAll'

Exception in thread Thread-4:
Traceback (most recent call last):
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 376, in _make_request
    self._validate_conn(conn)
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 994, in _validate_conn
    conn.connect()
  File "C:\python3\lib\site-packages\urllib3\connection.py", line 386, in connect
    self.sock = ssl_wrap_socket(
  File "C:\python3\lib\site-packages\urllib3\util\ssl_.py", line 370, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "C:\python3\lib\ssl.py", line 500, in wrap_socket
    return self.sslsocket_class._create(
  File "C:\python3\lib\ssl.py", line 1040, in _create
    self.do_handshake()
  File "C:\python3\lib\ssl.py", line 1309, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1129)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\python3\lib\site-packages\requests\adapters.py", line 439, in send
    resp = conn.urlopen(
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 719, in urlopen
    retries = retries.increment(
  File "C:\python3\lib\site-packages\urllib3\util\retry.py", line 436, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.proxy-list.download', port=443): Max retries exceeded with url: /api/v1/get?type=http&anon=elite (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
    self.run()
  File "C:\python3\lib\threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 23, in proxyListDownloadScraper
    html = session.get(url).text
  File "C:\python3\lib\site-packages\requests\sessions.py", line 546, in get
    return self.request('GET', url, **kwargs)
  File "C:\python3\lib\site-packages\requests\sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\python3\lib\site-packages\requests\sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "C:\python3\lib\site-packages\requests\adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.proxy-list.download', port=443): Max retries exceeded with url: /api/v1/get?type=http&anon=elite (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))

Exception in thread Thread-1:
Traceback (most recent call last):
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 421, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 416, in _make_request
    httplib_response = conn.getresponse()
  File "C:\python3\lib\http\client.py", line 1349, in getresponse
    response.begin()
  File "C:\python3\lib\http\client.py", line 316, in begin
    version, status, reason = self._read_status()
  File "C:\python3\lib\http\client.py", line 285, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\python3\lib\site-packages\requests\adapters.py", line 439, in send
    resp = conn.urlopen(
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 719, in urlopen
    retries = retries.increment(
  File "C:\python3\lib\site-packages\urllib3\util\retry.py", line 400, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "C:\python3\lib\site-packages\urllib3\packages\six.py", line 734, in reraise
    raise value.with_traceback(tb)
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 421, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "C:\python3\lib\site-packages\urllib3\connectionpool.py", line 416, in _make_request
    httplib_response = conn.getresponse()
  File "C:\python3\lib\http\client.py", line 1349, in getresponse
    response.begin()
  File "C:\python3\lib\http\client.py", line 316, in begin
    version, status, reason = self._read_status()
  File "C:\python3\lib\http\client.py", line 285, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\python3\lib\threading.py", line 973, in _bootstrap_inner
    self.run()
  File "C:\python3\lib\threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 56, in scrapeproxies
    soup=makesoup(url)
  File "C:\Users\mike\Desktop\net\proxy-scraper-1.0\proxyScraper.py", line 35, in makesoup
    page=requests.get(url)
  File "C:\python3\lib\site-packages\requests\api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "C:\python3\lib\site-packages\requests\api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\python3\lib\site-packages\requests\sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\python3\lib\site-packages\requests\sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "C:\python3\lib\site-packages\requests\adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

> Object has no attribute findAll

Same question here: AttributeError: 'NoneType' object has no attribute 'findAll'.
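
Aside from the missing-table case, the other tracebacks here are plain network failures: www.proxy-list.download reset the TLS handshake or closed the connection, which requests surfaces as SSLError / ConnectionError and which then kills the scraping thread. Below is a minimal sketch of a per-source fetch that swallows those errors so one dead source is just skipped; the function name and flow are illustrative, not the script's actual code:

```python
import requests

def fetch_source(url: str, timeout: float = 10.0) -> str:
    """Fetch one proxy source, returning '' instead of raising on network errors."""
    try:
        response = requests.get(url, timeout=timeout)
        response.raise_for_status()
        return response.text
    except requests.exceptions.RequestException as exc:
        # Covers SSLError, ConnectionError, Timeout, and the retry errors
        # that requests wraps, so the worker thread does not die.
        print(f"Skipping {url}: {exc}")
        return ""

# Example with the endpoint from the traceback above.
body = fetch_source("https://www.proxy-list.download/api/v1/get?type=http&anon=elite")
proxies = [line.strip() for line in body.splitlines() if line.strip()]
print(f"{len(proxies)} proxies scraped")
```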

iw4p commented 3 years ago

Did you try the new version of the script? Because I updated it.