Closed: Ctory-Nily closed this issue 2 months ago
2024-06-23 20:28:10 manga_image_translator_cpu | ERROR: [web_client] OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like kha-white/manga-ocr-base is not the path to a directory containing a file named preprocessor_config.json.
2024-06-23 20:28:10 manga_image_translator_cpu | Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
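The error message itself points at transformers' offline mode. A minimal sketch of forcing offline resolution, assuming the model files are already present in the local Hugging Face cache (the local path in the comment is hypothetical):

```python
import os

# Must be set BEFORE transformers / huggingface_hub are imported.
# With these set, from_pretrained() reads only the local cache and
# never attempts a network request to huggingface.co.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Alternatively, pass a local directory instead of a hub repo id:
# from transformers import AutoProcessor
# processor = AutoProcessor.from_pretrained("/models/manga-ocr-base")  # hypothetical path
```

In a Docker setup the same variables can be passed with `-e HF_HUB_OFFLINE=1 -e TRANSFORMERS_OFFLINE=1`.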
Resolved in the new version.
Issue
The connection fails whether I use a global proxy or connect directly.
2024-06-23 19:55:14 manga_image_translator_cpu | 2024-06-23 11:55:14.913 | INFO | manga_ocr.ocr:init:13 - Loading OCR model from kha-white/manga-ocr-base
The same timeout also occurs when I use the m2m100 translation model: downloading through the proxy fails outright, while downloading without the proxy works but is very slow and times out partway through. I also tried downloading the files locally, uploading them to a Chinese file-hosting site, and replacing the download links, but it still errors out.
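Rather than replacing download links in the code, note that huggingface_hub honors the `HF_ENDPOINT` environment variable, so pointing it at a reachable mirror may avoid the timeouts entirely. This is a sketch under that assumption; the mirror URL below is only an example and should be verified before use:

```python
import os

# Assumption: the installed huggingface_hub version reads HF_ENDPOINT.
# When set, all hub requests go to this base URL instead of huggingface.co.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # example mirror, verify before use
```

Like the offline flags, this must be set before transformers is imported (or passed to the container with `-e HF_ENDPOINT=...`).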
I don't know how to modify the code so that the container uses the model files I have already downloaded. I can't even find which file actually starts the download when I select the m2m100 translator and begin translating (I know it is not manga_translator/translators/m2m100.py).
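If the files are already downloaded, one option is to place them where the hub cache expects them instead of changing the code: the Hugging Face hub cache maps a repo id like kha-white/manga-ocr-base to a `models--{org}--{name}` directory under the cache root. A small sketch of computing that path (assuming the default cache root, i.e. `HF_HOME` unset):

```python
from pathlib import Path

def hf_cache_dir_for(repo_id: str) -> Path:
    """Return the default hub-cache directory for a given model repo id."""
    # Default cache root when HF_HOME is not set; adjust if you override it.
    cache_root = Path.home() / ".cache" / "huggingface" / "hub"
    # Hub cache naming scheme: "models--{org}--{name}"
    return cache_root / ("models--" + repo_id.replace("/", "--"))

print(hf_cache_dir_for("kha-white/manga-ocr-base"))
```

For a Docker container, mounting the host cache into the container (e.g. `-v ~/.cache/huggingface:/root/.cache/huggingface`) lets the library find the pre-downloaded snapshot without any code changes; the exact in-container path depends on the image and is an assumption here.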
Command Line Arguments
Console logs