puppylpg / oddish

Crawl CS:GO skin info from `buff.163.com` and Steam, then find the most suitable items to buy from the former and sell on the latter.
https://puppylpg.github.io/2019/12/07/python-crawler-buff-optimaze/
GNU General Public License v3.0

(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None)) #37

Closed: wbsddgg closed this issue 3 years ago

wbsddgg commented 3 years ago

Problem description

Hi, after a whole night of studying the tutorial videos I still can't get oddish to work; it fails with the error above. First, I ran the PC Leigod (雷神) accelerator exe, and after that I could open the Steam market in my browser. So I deleted everything after `proxy =`; in theory it should then default to reading the Steam market pages through the Leigod accelerator, right? The first few seconds of crawling went fine, but about ten seconds in I got the (10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None)) error. I searched online but couldn't make sense of what people were saying. Is there any way to fix this? Strangely, after a few errors it can still crawl one item:

```
2020-11-08 13:36:10,615 [INFO ] GET steam history price 145/720 for (加利尔 AR(StatTrak™) | 喧闹骷髅 (破损不堪)): https://steamcommunity.com/market/pricehistory/?appid=730&market_hash_name=StatTrak%E2%84%A2%20Galil%20AR%20%7C%20Chatterbox%20%28Well-Worn%29
```

But it only manages one item at a time; at this rate it will take until 2021. Please help!
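(For readers hitting the same issue: the traceback below shows oddish fetching pages with `requests.get`, so whether an emptied `proxy =` line actually routes traffic through an accelerator depends on how that accelerator works. A minimal sketch of the two modes, assuming a plain `requests.get` call; the proxy address below is a placeholder, not a value from this issue:)

```python
import requests

# Example endpoint taken from the logs in this issue.
STEAM_URL = ("https://steamcommunity.com/market/pricehistory/"
             "?appid=730&market_hash_name=AWP%20%7C%20Wildfire%20%28Well-Worn%29")

# Mode 1: an explicit proxy, i.e. what a non-empty `proxy = ...` config line
# would amount to. 127.0.0.1:1080 is a placeholder address.
proxies = {"http": "http://127.0.0.1:1080", "https": "http://127.0.0.1:1080"}
resp = requests.get(STEAM_URL, proxies=proxies, timeout=5)

# Mode 2: no explicit proxy (the emptied `proxy =` line). requests then issues
# a direct request, honoring only the HTTP_PROXY/HTTPS_PROXY environment
# variables; a VPN-style accelerator is used only if it tunnels at the system
# level rather than exposing a local proxy port.
resp = requests.get(STEAM_URL, timeout=5)
```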

Steps to reproduce

Error messages

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 439, in send
    resp = conn.urlopen(
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 724, in urlopen
    retries = retries.increment(
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 403, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\packages\six.py", line 734, in reraise
    raise value.with_traceback(tb)
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 381, in _make_request
    self._validate_conn(conn)
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 976, in _validate_conn
    conn.connect()
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connection.py", line 361, in connect
    self.sock = ssl_wrap_socket(
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\util\ssl_.py", line 377, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "C:\ProgramData\Anaconda3\lib\ssl.py", line 500, in wrap_socket
    return self.sslsocket_class._create(
  File "C:\ProgramData\Anaconda3\lib\ssl.py", line 1040, in _create
    self.do_handshake()
  File "C:\ProgramData\Anaconda3\lib\ssl.py", line 1309, in do_handshake
    self._sslobj.do_handshake()
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "C:\Users\Charles\oddish-master\src\util\requester.py", line 49, in get_json_dict_raw
    return requests.get(url, headers = headers, cookies = cookies, timeout = 5).text
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 530, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 643, in send
    r = adapter.send(request, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))
```

```
2020-11-08 13:39:23,262 [ERROR] Timeout for https://steamcommunity.com/market/pricehistory/?appid=730&market_hash_name=AWP%20%7C%20Wildfire%20%28Well-Worn%29 beyond the maximum(4) retry times. SKIP!
2020-11-08 13:39:23,263 [ERROR] Traceback (most recent call last):
  File "C:\Users\Charles\oddish-master\src\crawl\history_price_crawler.py", line 42, in crawl_history_price
    crawl_item_history_price(index, item, total_price_number)
  File "C:\Users\Charles\oddish-master\src\crawl\history_price_crawler.py", line 14, in crawl_item_history_price
    steam_history_prices = get_json_dict(steam_price_url, steam_cookies, True)
  File "C:\Users\Charles\oddish-master\src\util\requester.py", line 63, in get_json_dict
    store(url,json_data)
  File "C:\Users\Charles\oddish-master\src\util\cache.py", line 46, in store
    f.write(data)
TypeError: write() argument must be str, not None
```
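(The first [ERROR] line shows the crawler giving up after a fixed number of retries. A minimal sketch of that kind of retry-then-skip loop, assuming the `requests.get` call shape from the traceback above; `MAX_RETRY` and `get_with_retry` are illustrative names, not oddish's actual code:)

```python
import logging

import requests

MAX_RETRY = 4  # assumed cap, matching "maximum(4)" in the log above


def get_with_retry(url, headers=None, cookies=None):
    """Fetch a URL, retrying on resets/timeouts; return None if every attempt fails."""
    for attempt in range(1, MAX_RETRY + 1):
        try:
            # Same call shape as requester.py line 49 in the traceback above.
            return requests.get(url, headers=headers, cookies=cookies, timeout=5).text
        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as e:
            logging.warning("attempt %d/%d failed for %s: %s", attempt, MAX_RETRY, url, e)
    logging.error("Timeout for %s beyond the maximum(%d) retry times. SKIP!", url, MAX_RETRY)
    return None  # callers must handle None, or the TypeError above occurs
```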

Related screenshots (please attach them if possible)

Software information

Please fill in the following information:

puppylpg commented 3 years ago

It's still a proxy problem; a timeout occurred.

The later error was caused by the timeout: no data was fetched, so there was nothing to write. I've updated the code; try again and it should no longer raise "TypeError: write() argument must be str, not None". But if the proxy doesn't work, it really doesn't work.
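(For reference, a guard of roughly this shape in `cache.py` would be enough to avoid the TypeError; this is a hypothetical sketch, not the actual commit. The `store(url, data)` signature comes from the traceback above; `cache_path_for` and `CACHE_DIR` are invented helpers:)

```python
import hashlib
import os

CACHE_DIR = "cache"  # assumed cache location, not necessarily oddish's


def cache_path_for(url):
    """Map a URL to a stable cache file path (hypothetical helper)."""
    return os.path.join(CACHE_DIR, hashlib.md5(url.encode("utf-8")).hexdigest())


def store(url, data):
    """Cache a raw response body; skip if the fetch returned nothing."""
    if data is None:
        # A timed-out request yields no body; writing None would raise
        # "TypeError: write() argument must be str, not None", so skip caching.
        return
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(cache_path_for(url), "w", encoding="utf-8") as f:
        f.write(data)
```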

wbsddgg commented 3 years ago

> It's still a proxy problem; a timeout occurred.
>
> The later error was caused by the timeout: no data was fetched, so there was nothing to write. I've updated the code; try again and it should no longer raise "TypeError: write() argument must be str, not None". But if the proxy doesn't work, it really doesn't work.

Got it, and thanks for the update! I'll try it as soon as I take a break. Thanks a lot~~