kevinzg / facebook-scraper

Scrape Facebook public pages without an API key
MIT License

requests.exceptions.ProxyError: HTTPSConnectionPool (urllib3.exceptions.MaxRetryError) #620

Open JJery-web opened 2 years ago

JJery-web commented 2 years ago

I have a question. Why can I run this code on one computer, but when I use the same code and the same version of the package on another computer, I don't get any results?

The error is:

Traceback (most recent call last): File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 696, in urlopen self._prepare_proxy(conn) File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 964, in _prepare_proxy conn.connect() File "D:\Anaconda3\lib\site-packages\urllib3\connection.py", line 364, in connect conn = self._connect_tls_proxy(hostname, conn) File "D:\Anaconda3\lib\site-packages\urllib3\connection.py", line 501, in _connect_tls_proxy socket = ssl_wrapsocket( File "D:\Anaconda3\lib\site-packages\urllib3\util\ssl.py", line 453, in ssl_wrap_socket ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_intls) File "D:\Anaconda3\lib\site-packages\urllib3\util\ssl.py", line 495, in _ssl_wrap_socket_impl return ssl_context.wrap_socket(sock) File "D:\Anaconda3\lib\ssl.py", line 500, in wrap_socket return self.sslsocket_class._create( File "D:\Anaconda3\lib\ssl.py", line 1040, in _create self.do_handshake() File "D:\Anaconda3\lib\ssl.py", line 1309, in do_handshake self._sslobj.do_handshake() socket.timeout: _ssl.c:1112: The handshake operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 439, in send resp = conn.urlopen( File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 755, in urlopen retries = retries.increment( File "D:\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 574, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='m.facebook.com', port=443): Max retries exceeded with url: /settings?locale=en_US (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1112: The handshake operation timed out')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "E:\1231 social.py", line 103, in for post in get_posts(account=link, pages=None, timeout=120, cookies="mycookies2.json",options={"allow_extra_requests": False,"reactions":False,"posts_per_page": 300}): File "D:\Anaconda3\lib\site-packages\facebook_scraper__init__.py", line 172, in get_posts set_cookies(cookies) File "D:\Anaconda3\lib\site-packages\facebook_scraper__init__.py", line 44, in set_cookies if not _scraper.is_logged_in(): File "D:\Anaconda3\lib\site-packages\facebook_scraper\facebook_scraper.py", line 699, in is_logged_in self.get('https://m.facebook.com/settings') File "D:\Anaconda3\lib\site-packages\facebook_scraper\facebook_scraper.py", line 596, in get response = self.session.get(url=url, self.requests_kwargs, kwargs) File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 555, in get return self.request('GET', url, kwargs) File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 542, in request resp = self.send(prep, send_kwargs) File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 655, in send r = adapter.send(request, **kwargs) File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 510, in send raise ProxyError(e, request=request) requests.exceptions.ProxyError: HTTPSConnectionPool(host='m.facebook.com', port=443): Max retries exceeded with url: /settings?locale=en_US (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1112: The handshake operation timed out')))

JJery-web commented 2 years ago

I use my cookies to log in, and I can view the content in Chrome, so why can't I get any posts? I'm especially confused because with the same code and method I do get results on the other computer.
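For reference, the call in the traceback matches the usual cookie-based usage of facebook-scraper. A stripped-down sketch of that pattern (the page name below is just a placeholder for the real account):

    from facebook_scraper import get_posts

    # "mycookies2.json" is the cookies file exported from the logged-in browser session;
    # "somepage" stands in for the actual page or account being scraped.
    for post in get_posts(
        account="somepage",
        cookies="mycookies2.json",
        timeout=120,
        options={"allow_extra_requests": False, "reactions": False, "posts_per_page": 300},
    ):
        print(post["post_id"])

Note that the cookies themselves are probably not the problem here: the ProxyError is raised during the is_logged_in() check, before any request ever reaches Facebook.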

neon-ninja commented 2 years ago

It looks like the proxy isn't accessible from this computer. Perhaps there's some sort of IP-based restriction, or a firewall.
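One way to confirm this is to hit Facebook through the same proxy with plain requests, outside the scraper, and see whether the handshake still times out. A rough sketch, with a placeholder proxy URL:

    import requests

    # Replace with the proxy the failing machine is actually configured to use.
    proxies = {"https": "http://your.proxy.host:8080"}

    try:
        r = requests.get("https://m.facebook.com/", proxies=proxies, timeout=30)
        print("Proxy OK, status:", r.status_code)
    except requests.exceptions.ProxyError as e:
        print("Proxy not reachable from this machine:", e)

If this fails in the same way, the issue is the network path to the proxy (firewall, IP allow-list, VPN), not facebook-scraper.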