eight04 / ComicCrawler

An image crawler written in Python.

Problems downloading from both manhuagui and copymanga #322

Closed qewrwerwer closed 2 years ago

qewrwerwer commented 2 years ago

After updating the script, downloading from manhuagui gives this:

C:\Users\ertetet>comiccrawler gui
Update checking done
Start analyzing https://www.manhuagui.com/comic/18715/
Thread crashed: <function DownloadManager.start_analyze.<locals>.analyze_thread at 0x0000000003B73C18>
Traceback (most recent call last):
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 382, in _make_request
    self._validate_conn(conn)
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 1010, in _validate_conn
    conn.connect()
  File "c:\1\lib\site-packages\urllib3\connection.py", line 469, in connect
    _match_hostname(cert, self.assert_hostname or server_hostname)
  File "c:\1\lib\site-packages\urllib3\connection.py", line 542, in _match_hostname
    match_hostname(cert, asserted_hostname)
  File "c:\1\lib\ssl.py", line 334, in match_hostname
    % (hostname, ', '.join(map(repr, dnsnames))))
ssl.SSLCertVerificationError: ("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'",)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\requests\adapters.py", line 450, in send
    timeout=timeout
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 756, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "c:\1\lib\site-packages\urllib3\util\retry.py", line 574, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.manhuagui.com', port=443): Max retries exceeded with url: /comic/18715/ (Caused by SSLError(SSLCertVerificationError("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'")))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\worker\__init__.py", line 474, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\download_manager.py", line 163, in analyze_thread
    Analyzer(mission).analyze()
  File "c:\1\lib\site-packages\comiccrawler\analyzer.py", line 57, in analyze
    self.do_analyze()
  File "c:\1\lib\site-packages\comiccrawler\analyzer.py", line 80, in do_analyze
    self.html = self.grabber.html(self.mission.url, retry=True)
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 11, in html
    return self.grab(grabhtml, url, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 30, in grab
    return grab_method(url, **new_kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 151, in grabhtml
    r = grabber(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 105, in grabber
    r = await_(do_request, s, url, proxies, retry, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 905, in wrapped
    return f(*args, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 927, in await_
    return async_(callback, *args, **kwargs).get()
  File "c:\1\lib\site-packages\worker\__init__.py", line 682, in get
    raise err
  File "c:\1\lib\site-packages\worker\__init__.py", line 474, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 119, in do_request
    proxies=proxies, **kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 529, in request
    resp = self.send(prep, **send_kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 645, in send
    r = adapter.send(request, **kwargs)
  File "c:\1\lib\site-packages\requests\adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.manhuagui.com', port=443): Max retries exceeded with url: /comic/18715/ (Caused by SSLError(SSLCertVerificationError("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'")))
Session saved
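
The traceback shows that the certificate presented for www.manhuagui.com was actually issued for *.expresswifi.com, which points to something on the network path (a transparent proxy or captive portal) intercepting the TLS connection rather than a bug in the crawler. A minimal diagnostic sketch, using only the standard library, prints whatever certificate the network really serves (the hostname is taken from the log above; nothing here is part of ComicCrawler):

```python
import socket
import ssl

HOST = "www.manhuagui.com"  # host from the traceback above

# Disable verification so the handshake completes and we can inspect
# whatever certificate the network hands us.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der_cert = tls.getpeercert(binary_form=True)

# Print the certificate in PEM form; feed it to `openssl x509 -noout -text`
# to read the subject/SAN. A subject of *.expresswifi.com confirms the
# connection is being intercepted upstream.
print(ssl.DER_cert_to_PEM_cert(der_cert))
```

If the subject really is *.expresswifi.com, only changing networks (or tunnelling past the middlebox) makes verification succeed properly.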

Downloading from copymanga gives this:

C:\Users\ertetet>comiccrawler gui
Start analyzing https://copymanga.com/comic/nvzhuangcanjiaxianxiajuhuidehua
Analyzing success!
Start download 女裝參加線下聚會的話...
Start downloading 女裝參加線下聚會的話...
total 84 episode.
Downloading ep 第01话
Downloading 第01话 page 1: https://mirror277.mangafuna.xyz:12001/comic/nvzhuangcanjiaxianxiajuhuidehua/ce2fa/6634099e-3e0c-11ea-ac9c-00163e0ca5bd.jpg!kb_w_read_large

Traceback (most recent call last):
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 382, in _make_request
    self._validate_conn(conn)
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 1010, in _validate_conn
    conn.connect()
  File "c:\1\lib\site-packages\urllib3\connection.py", line 426, in connect
    tls_in_tls=tls_in_tls,
  File "c:\1\lib\site-packages\urllib3\util\ssl_.py", line 450, in ssl_wrap_socket
    sock, context, tls_in_tls, server_hostname=server_hostname
  File "c:\1\lib\site-packages\urllib3\util\ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "c:\1\lib\ssl.py", line 423, in wrap_socket
    session=session
  File "c:\1\lib\ssl.py", line 870, in _create
    self.do_handshake()
  File "c:\1\lib\ssl.py", line 1139, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\requests\adapters.py", line 450, in send
    timeout=timeout
  File "c:\1\lib\site-packages\urllib3\connectionpool.py", line 756, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "c:\1\lib\site-packages\urllib3\util\retry.py", line 574, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='mirror277.mangafuna.xyz', port=12001): Max retries exceeded with url: /comic/nvzhuangcanjiaxianxiajuhuidehua/ce2fa/6634099e-3e0c-11ea-ac9c-00163e0ca5bd.jpg!kb_w_read_large (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\1\lib\site-packages\comiccrawler\crawler.py", line 338, in error_loop
    process()
  File "c:\1\lib\site-packages\comiccrawler\crawler.py", line 311, in download
    crawler.download_image()
  File "c:\1\lib\site-packages\comiccrawler\crawler.py", line 82, in download_image
    referer=None if getattr(self.mod, "no_referer", False) else self.ep.current_url
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 14, in img
    return self.grab(grabimg, url, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\module_grabber.py", line 30, in grab
    return grab_method(url, **new_kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 208, in grabimg
    return ImgResult(grabber(*args, **kwargs))
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 105, in grabber
    r = await_(do_request, s, url, proxies, retry, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 905, in wrapped
    return f(*args, **kwargs)
  File "c:\1\lib\site-packages\worker\__init__.py", line 927, in await_
    return async_(callback, *args, **kwargs).get()
  File "c:\1\lib\site-packages\worker\__init__.py", line 682, in get
    raise err
  File "c:\1\lib\site-packages\worker\__init__.py", line 474, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "c:\1\lib\site-packages\comiccrawler\grabber.py", line 119, in do_request
    proxies=proxies, **kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 529, in request
    resp = self.send(prep, **send_kwargs)
  File "c:\1\lib\site-packages\requests\sessions.py", line 645, in send
    r = adapter.send(request, **kwargs)
  File "c:\1\lib\site-packages\requests\adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='mirror277.mangafuna.xyz', port=12001): Max retries exceeded with url: /comic/nvzhuangcanjiaxianxiajuhuidehua/ce2fa/6634099e-3e0c-11ea-ac9c-00163e0ca5bd.jpg!kb_w_read_large (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)')))
Downloading 第01话 page 1: https://mirror277.mangafuna.xyz:12001/comic/nvzhuangcanjiaxianxiajuhuidehua/ce2fa/6634099e-3e0c-11ea-ac9c-00163e0ca5bd.jpg!kb_w_read_large

(the retry fails with the same SSLCertVerificationError traceback)
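
This copymanga failure is different: "unable to get local issuer certificate" means the certificate chain could not be completed locally, typically because the mirror does not send its intermediate certificate or because its CA is missing from the trust store. A quick hedged check is to retry the same URL while passing the certifi bundle (the CA store requests uses by default) explicitly; the URL is copied from the log, everything else is illustrative:

```python
import certifi
import requests

# URL copied from the failing download above
URL = ("https://mirror277.mangafuna.xyz:12001/comic/"
       "nvzhuangcanjiaxianxiajuhuidehua/ce2fa/"
       "6634099e-3e0c-11ea-ac9c-00163e0ca5bd.jpg!kb_w_read_large")

try:
    # verify=certifi.where() is just requests' default CA store made explicit
    r = requests.get(URL, timeout=10, verify=certifi.where())
    print("verified OK:", r.status_code)
except requests.exceptions.SSLError as err:
    # Failing here too means the server's chain is incomplete or the
    # local bundle is stale; `pip install -U certifi` is a cheap first try.
    print("verification failed:", err)
```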
eight04 commented 2 years ago

copymanga seems to be down.

qewrwerwer commented 2 years ago

copymanga is fixed now, but on my end manhuagui still fails with MaxRetryError: HTTPSConnectionPool(host='www.manhuagui.com', port=443): Max retries exceeded with url: /comic/18715/ (Caused by SSLError(SSLCertVerificationError("hostname 'www.manhuagui.com' doesn't match either of '*.expresswifi.com', 'expresswifi.com'"). Oddly, mhgu works fine.

eight04 commented 2 years ago

Try disabling SSL verification in the settings file:

[看漫畫]
verify=false
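
For context, a module setting like this presumably ends up as the `verify` argument of the underlying requests call. A minimal sketch of the equivalent (the warning suppression is added here for illustration and is not necessarily what ComicCrawler does):

```python
import requests
import urllib3

# requests emits an InsecureRequestWarning for every verify=False
# request; silence it so the log stays readable.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# verify=False skips certificate validation entirely, so the
# hostname-mismatch error from the tracebacks above disappears.
r = requests.get("https://www.manhuagui.com/comic/18715/",
                 verify=False, timeout=10)
print(r.status_code)
```

Keep in mind this hides the symptom rather than fixing it: with verification off, whatever middlebox produced the expresswifi.com certificate can read and modify the traffic.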