InternLM / InternLM-XComposer

InternLM-XComposer2 is a groundbreaking vision-language large model (VLLM) excelling in free-form text-image composition and comprehension.

My machine cannot connect to huggingface, and the clip-vit I downloaded for offline use does not work #295

Closed deku0818 closed 2 months ago

deku0818 commented 2 months ago

I edited /root/.cache/huggingface/modules/transformers_modules/build_mlp.py with vim and replaced vision_tower = 'openai/clip-vit-large-patch14-336' with a local path (roughly the edit sketched below), but the next time I ran the model the file was regenerated and my change was overwritten.
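For reference, the edit in question looks roughly like this (a sketch only; the exact variable name in build_mlp.py may differ between releases, and the local path below is a placeholder):

```python
# build_mlp.py (sketch; surrounding code abridged, names may differ)

# Original line: the CLIP vision tower is resolved by Hub repo id.
# vision_tower = 'openai/clip-vit-large-patch14-336'

# Edited line: point to an already-downloaded local copy instead.
vision_tower = '/root/models/clip-vit-large-patch14-336'  # placeholder local path
```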

yuhangzang commented 2 months ago

You may (1) download the xcomposer2-vl-7b HF repo into your local path, (2) modify the build_mlp.py in your local repo, and (3) replace model = AutoModel.from_pretrained('internlm/internlm-xcomposer2-vl-7b') with model = AutoModel.from_pretrained('xxx'), where xxx is your local model path.
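A minimal sketch of steps (1)-(3), assuming the repo has already been downloaded to ./internlm-xcomposer2-vl-7b (a placeholder path):

```python
from transformers import AutoModel, AutoTokenizer

local_path = './internlm-xcomposer2-vl-7b'  # placeholder: your local copy of the HF repo

# trust_remote_code is required because the repo ships custom modeling code (build_mlp.py etc.).
# When loading from a local directory, the build_mlp.py inside that directory is the one
# copied into ~/.cache/huggingface/modules/transformers_modules on each run, so an edit made
# there is the edit that ends up in the cache.
model = AutoModel.from_pretrained(local_path, trust_remote_code=True).eval()
tokenizer = AutoTokenizer.from_pretrained(local_path, trust_remote_code=True)
```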

deku0818 commented 2 months ago

> You may (1) download the HF repo into your local path, (2) modify the build_mlp.py, and (3) replace model = AutoModel.from_pretrained('internlm/internlm-xcomposer2-4khd-7b') with model = AutoModel.from_pretrained('xxx'), where xxx is your local model path.

I was referring to openai/clip-vit-large-patch14-336. When I run internlm-xcomposer2-vl-7b I get this error: raise EnvironmentError( OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like openai/clip-vit-large-patch14-336 is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'. (screenshots attached)
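For what it's worth, the offline mode the error message links to only makes transformers resolve openai/clip-vit-large-patch14-336 from the local HF cache; it does not redirect the repo id to an arbitrary folder, which is why the path inside build_mlp.py has to be changed. A hedged sketch of what offline mode does and does not cover:

```python
import os

# Offline mode only helps if openai/clip-vit-large-patch14-336 is already in the HF cache
# (e.g. downloaded once on a machine with network access, then the cache copied over).
# It does NOT remap the repo id to an arbitrary local folder.
os.environ['HF_HUB_OFFLINE'] = '1'
os.environ['TRANSFORMERS_OFFLINE'] = '1'

from transformers import CLIPVisionModel  # imported after the flags so they take effect

vision_tower = CLIPVisionModel.from_pretrained('openai/clip-vit-large-patch14-336')
```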

yuhangzang commented 2 months ago

Please download the xcomposer2-vl-7b HF repo into your local path, otherwise, your modification will be overwritten.
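To make "download the repo into your local path" concrete, one way to do it programmatically is with huggingface_hub's snapshot_download (run it on a machine that can reach the Hub, or copy the resulting folder over afterwards; the target directory is a placeholder):

```python
from huggingface_hub import snapshot_download

# Fetch the whole model repo (weights plus the custom build_mlp.py) into a plain folder.
local_path = snapshot_download(
    repo_id='internlm/internlm-xcomposer2-vl-7b',
    local_dir='./internlm-xcomposer2-vl-7b',  # placeholder target folder
)

# Edit ./internlm-xcomposer2-vl-7b/build_mlp.py to point the CLIP path at a local copy,
# then load with AutoModel.from_pretrained(local_path, trust_remote_code=True).
# The transformers_modules cache is regenerated from this folder on each run,
# so the edit made here is what gets copied into the cache.
print(local_path)
```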

deku0818 commented 2 months ago

> Please download the xcomposer2-vl-7b HF repo into your local path, otherwise your modification will be overwritten.

Yes, I had already downloaded it to a local path before doing the above. (screenshot attached)

deku0818 commented 2 months ago

> Please download the xcomposer2-vl-7b HF repo into your local path, otherwise your modification will be overwritten.

It still gets overwritten. I also tried deploying with LMDeploy, and that failed as well. (screenshot attached)

tianyour commented 1 month ago

> Please download the xcomposer2-vl-7b HF repo into your local path, otherwise your modification will be overwritten.
>
> It still gets overwritten. I also tried deploying with LMDeploy, and that failed as well.

Has this issue been resolved?

isruihu commented 4 weeks ago

Same problem here: the .cache/huggingface/modules/transformers_modules/build_mlp.py file is refreshed on every run, so openai/clip-vit-large-patch14-336 cannot be loaded from a local path. Has anyone solved this?

tianyour commented 4 weeks ago

Hello, your e-mail has been received and I will look at it as soon as possible. (This is an automatic reply confirming that your e-mail was received. Thank you.)

deku0818 commented 1 week ago

> Same problem here: the .cache/huggingface/modules/transformers_modules/build_mlp.py file is refreshed on every run, so openai/clip-vit-large-patch14-336 cannot be loaded from a local path. Has anyone solved this?

It was never solved; I've given up on using this model.

tianyour commented 1 week ago

Hello, your e-mail has been received and I will look at it as soon as possible. (This is an automatic reply confirming that your e-mail was received. Thank you.)