Zivch opened this issue 6 days ago
This looks like it picked up the wrong protocol:
# error
file://i.hamreus.com/ps3/b/bt-535/lqsr/第28话/0001.jpg.webp?e=1732106351&m=XZ-T0svx8NCQrGecmXrhow
# should be
https://i.hamreus.com/ps3/b/bt-535/lqsr/第28话/0001.jpg.webp?e=1732106351&m=XZ-T0svx8NCQrGecmXrhow
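For context, requests refuses any URL whose scheme it has no connection adapter for, which is exactly the InvalidSchema error in the logs below. A minimal sketch reusing the URL above (only an illustration of the failure mode, not ComicCrawler's own code path):

```python
import requests

# requests has no adapter for the file:// scheme, so it fails
# with InvalidSchema before any network I/O happens.
bad_url = ("file://i.hamreus.com/ps3/b/bt-535/lqsr/第28话/0001.jpg.webp"
           "?e=1732106351&m=XZ-T0svx8NCQrGecmXrhow")

try:
    requests.get(bad_url)
except requests.exceptions.InvalidSchema as err:
    print(err)  # No connection adapters were found for 'file://i.hamreus.com/...'

# Swapping the scheme gives a URL requests can at least dispatch
# (the e/m tokens and a proper Referer are still needed for a 200).
good_url = "https://" + bad_url[len("file://"):]
print(good_url)
```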
I don't see this problem on my PC. Which ComicCrawler version are you on? You can run:
comiccrawler --version
After seeing your reply I upgraded once with pip --upgrade-strategy eager, but it still fails to download. version: 2024.11.14
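As a sanity check that the upgrade actually replaced the installed package, the version can also be read programmatically (assuming the distribution is named comiccrawler, matching the pip package):

```python
from importlib.metadata import version

# Should print the same string as `comiccrawler --version`, e.g. 2024.11.14.
print(version("comiccrawler"))
```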
Example URL: https://www.manhuagui.com/comic/49757/
Log:
Start analyzing https://www.manhuagui.com/comic/49757/
Analyzing success!
Start downloading 兽王与药草 total 24 episode.
Downloading ep 番外篇 一卷番外
Downloading 番外篇 一卷番外 page 1: file://i.hamreus.com/ps1/b/byh-13581/49757/一卷番外/202.jpg.webp?e=1732764870&m=hAlhIr5kMz2XGFuVT1szAQ
Traceback (most recent call last):
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 393, in error_loop
    process()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 364, in download
    crawler.download_image()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 98, in download_image
    result = self.downloader.img(
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\module_grabber.py", line 18, in img
    return self.grab(grabimg, url, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\module_grabber.py", line 34, in grab
    return grab_method(url, **new_kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 189, in grabimg
    r = grabber(*args, header=header, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 94, in grabber
    r = await_(do_request, s, url, proxies, retry, headers=header, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 943, in wrapped
    return f(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 962, in await_
    return async_(callback, *args, **kwargs).get()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 691, in get
    raise err
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 483, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 108, in do_request
    r = s.request(kwargs.pop("method", "GET"), url, proxies=proxies, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\session_manager.py", line 25, in request
    return super().request(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 697, in send
    adapter = self.get_adapter(url=request.url)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 792, in get_adapter
    raise InvalidSchema(f"No connection adapters were found for {url!r}")
requests.exceptions.InvalidSchema: No connection adapters were found for 'file://i.hamreus.com/ps1/b/byh-13581/49757/一卷番外/202.jpg.webp?e=1732764870&m=hAlhIr5kMz2XGFuVT1szAQ'
wait 10 seconds...
Stop downloading
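Until the module is fixed, one possible stopgap is to normalize the scheme before the image request goes out. This is only a sketch of the idea (normalize_image_url is a hypothetical helper, not part of ComicCrawler): only the scheme changes, while the host, path, and e/m query tokens stay intact.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_image_url(url: str) -> str:
    """Hypothetical helper: rewrite a file:// image URL to https://,
    leaving the host, path, and e/m query tokens untouched."""
    parts = urlsplit(url)
    if parts.scheme == "file":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(normalize_image_url(
    "file://i.hamreus.com/ps1/b/byh-13581/49757/一卷番外/202.jpg.webp"
    "?e=1732764870&m=hAlhIr5kMz2XGFuVT1szAQ"
))
# https://i.hamreus.com/ps1/b/byh-13581/49757/一卷番外/202.jpg.webp?e=1732764870&m=hAlhIr5kMz2XGFuVT1szAQ
```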
I can open the page in a browser but still can't download, so it shouldn't be an IP ban.
If you have time, eight04, please take a look. Thanks.
Example page: https://www.manhuagui.com/comic/48463/
Error log:
Start downloading 猎奇杀人 total 35 episode.
Downloading ep 单话 第28话
Downloading 单话 第28话 page 1: file://i.hamreus.com/ps3/b/bt-535/lqsr/第28话/0001.jpg.webp?e=1732106351&m=XZ-T0svx8NCQrGecmXrhow
Traceback (most recent call last):
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 392, in error_loop
    process()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 363, in download
    crawler.download_image()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\crawler.py", line 98, in download_image
    result = self.downloader.img(
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\module_grabber.py", line 18, in img
    return self.grab(grabimg, url, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\module_grabber.py", line 34, in grab
    return grab_method(url, **new_kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 188, in grabimg
    r = grabber(*args, header=header, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 94, in grabber
    r = await_(do_request, s, url, proxies, retry, headers=header, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 943, in wrapped
    return f(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 962, in await_
    return async_(callback, *args, **kwargs).get()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 691, in get
    raise err
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\worker\__init__.py", line 483, in wrap_worker
    self.ret = self.worker(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\grabber.py", line 108, in do_request
    r = s.request(kwargs.pop("method", "GET"), url, proxies=proxies, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\comiccrawler\session_manager.py", line 25, in request
    return super().request(*args, **kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 697, in send
    adapter = self.get_adapter(url=request.url)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 792, in get_adapter
    raise InvalidSchema(f"No connection adapters were found for {url!r}")
requests.exceptions.InvalidSchema: No connection adapters were found for 'file://i.hamreus.com/ps3/b/bt-535/lqsr/第28话/0001.jpg.webp?e=1732106351&m=XZ-T0svx8NCQrGecmXrhow'
wait 10 seconds...