eight04 / ComicCrawler

An image crawler written in Python.

Problems downloading from both Manhuagui and copymanga #321

Closed qewrwerwer closed 2 years ago

qewrwerwer commented 2 years ago

After updating the script, downloading from Manhuagui gives this:

```
C:\Users\ertetet>comiccrawler gui
Update checking done
Start analyzing https://www.manhuagui.com/comic/18715/
Thread crashed: <function DownloadManager.start_analyze.<locals>.analyze_thread at 0x0000000003B73C18>
Traceback (most recent call last):
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 382, in _make_request
    self._validate_conn(conn)
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 1010, in _validate_conn
    conn.connect()
  File "c:\1\lib\site-packages\urllib3\connection.py", line 469, in connect
    _match_hostname(cert, self.assert_hostname or server_hostname)
  File "c:\1\lib\site-packages\urllib3\connection.py", line 542, in _match_hostname
    match_hostname(cert, asserted_hostname)
  File "c:\1\lib\ssl.py", line 334, in match_hostname
    % (hostname, ', '.join(map(repr, dnsnames))))
ssl.SSLCertVerificationError: ("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'",)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\requests\adapters.py", line 450, in send
    timeout=timeout
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 756, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "c:\1\lib\site-packages\urllib3\util\retry.py", line 574, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.manhuagui.com', port=443): Max retries exceeded with url: /comic/18715/ (Caused by SSLError(SSLCertVerificationError("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'")))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\worker\__init__.py", line 474, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\download_manager.py", line 163, in analyze_thread
    Analyzer(mission).analyze()
  File "c:\1\lib\site-packages\comiccrawler\analyzer.py", line 57, in analyze
    self.do_analyze()
  File "c:\1\lib\site-packages\comiccrawler\analyzer.py", line 80, in do_analyze
    self.html = self.grabber.html(self.mission.url, retry=True)
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 11, in html
    return self.grab(grabhtml, url, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 30, in grab
    return grab_method(url, **new_kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 151, in grabhtml
    r = grabber(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 105, in grabber
    r = await_(do_request, s, url, proxies, retry, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 905, in wrapped
    return f(*args, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 927, in await_
    return async_(callback, *args, **kwargs).get()
  File "c:\1\lib\site-packages\worker\__init__.py", line 682, in get
    raise err
  File "c:\1\lib\site-packages\worker\__init__.py", line 474, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 119, in do_request
    proxies=proxies, **kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 529, in request
    resp = self.send(prep, **send_kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 645, in send
    r = adapter.send(request, **kwargs)
  File "c:\1\lib\site-packages\requests\adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.manhuagui.com', port=443): Max retries exceeded with url: /comic/18715/ (Caused by SSLError(SSLCertVerificationError("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'")))
Session saved
```

Downloading from copymanga gives this: