Closed deku0818 closed 2 months ago
You may (1) download the xcomposer2-vl-7b HF repo into your local path, (2) modify the build_mlp.py in your local repo, and (3) replace model = AutoModel.from_pretrained('internlm/internlm-xcomposer2-vl-7b') with model = AutoModel.from_pretrained('xxx'), where xxx is your local model path.
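Steps (2) and (3) above can be sketched as follows. All paths here are hypothetical placeholders; the helper simply rewrites the hard-coded CLIP hub id inside build_mlp.py, which is how the released file references its vision tower.

```python
from pathlib import Path

HUB_ID = "openai/clip-vit-large-patch14-336"

def patch_vision_tower(build_mlp: Path, local_clip: str) -> None:
    """Point build_mlp.py's vision tower at a local CLIP checkout.

    Assumes the file contains the literal hub id above, as the
    released build_mlp.py does.
    """
    src = build_mlp.read_text()
    build_mlp.write_text(src.replace(HUB_ID, local_clip))

# Hypothetical paths; substitute your own local clones.
# patch_vision_tower(Path("/models/internlm-xcomposer2-vl-7b/build_mlp.py"),
#                    "/models/clip-vit-large-patch14-336")
# model = AutoModel.from_pretrained("/models/internlm-xcomposer2-vl-7b",
#                                   trust_remote_code=True)
```

After patching, load the model from the local directory (with trust_remote_code=True) rather than the hub id, so the patched file is the one that gets copied into the modules cache.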
You may (1) download the HF repo into your local path, (2) modify the build_mlp.py, and (3) replace model = AutoModel.from_pretrained('internlm/internlm-xcomposer2-4khd-7b') with model = AutoModel.from_pretrained('xxx'), where xxx is your local model path.
I was referring to openai/clip-vit-large-patch14-336. I got an error when running internlm-xcomposer2-vl-7b:
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like openai/clip-vit-large-patch14-336 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
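As the error message suggests, once every required repo (including the CLIP vision tower) is available locally or in the cache, you can force offline mode so transformers never tries to reach huggingface.co. A minimal sketch; the variables must be set before transformers is imported (or exported in the shell):

```python
import os

# Make transformers and huggingface_hub resolve everything from local
# files/cache and never attempt a network call.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"
```

This does not by itself fix a wrong path inside build_mlp.py, but it rules out the "couldn't connect to huggingface.co" failure mode once everything is cached.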
Please download the xcomposer2-vl-7b HF repo into your local path, otherwise, your modification will be overwritten.
Yes, I had already downloaded it to a local path before doing the above.
Please download the xcomposer2-vl-7b HF repo into your local path, otherwise, your modification will be overwritten.
It still gets overwritten. I also tried deploying with LMDeploy, and that failed too.
Has this issue been resolved?
Same problem here: the .cache/huggingface/modules/transformers_modules/build_mlp.py file is refreshed on every run, so openai/clip-vit-large-patch14-336 cannot be loaded from a local path. Has anyone solved this?
Hello, I have received your e-mail and will review it as soon as possible! (This is an automatic reply confirming that your e-mail was received. Thank you.)
Same problem here: the .cache/huggingface/modules/transformers_modules/build_mlp.py file is refreshed on every run, so openai/clip-vit-large-patch14-336 cannot be loaded from a local path. Has anyone solved this?
Not solved; I have given up on this model.
I edited the file (vim /root/.cache/huggingface/modules/transformers_modules/build_mlp.py) and replaced vision_tower = 'openai/clip-vit-large-patch14-336' with my local path, but it was overwritten again on the next run.
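The cache copy is overwritten because, on every trust_remote_code load, transformers re-copies the remote-code modules from the model repo into ~/.cache/huggingface/modules/transformers_modules, so edits made directly in the cache are always undone; the durable fix is to edit build_mlp.py in the downloaded repo and load the model from that local directory. A small stand-alone simulation of that refresh behaviour (the paths and the copy step are illustrative, not the library's actual code):

```python
import shutil
import tempfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
repo = tmp / "internlm-xcomposer2-vl-7b"  # your local model clone (hypothetical path)
cache = tmp / "transformers_modules"      # stand-in for the HF modules cache
repo.mkdir()
cache.mkdir()

HUB_LINE = "vision_tower = 'openai/clip-vit-large-patch14-336'\n"
LOCAL_LINE = "vision_tower = '/models/clip-vit-large-patch14-336'\n"

# The repo ships build_mlp.py pointing at the hub id.
(repo / "build_mlp.py").write_text(HUB_LINE)

def refresh_cache() -> None:
    # Models loaded with trust_remote_code copy their code files from the
    # repo into the modules cache, clobbering manual edits in the cache.
    shutil.copy(repo / "build_mlp.py", cache / "build_mlp.py")

# Editing only the cache copy does not survive the next load:
refresh_cache()
(cache / "build_mlp.py").write_text(LOCAL_LINE)
refresh_cache()
assert (cache / "build_mlp.py").read_text() == HUB_LINE

# Editing the repo copy is durable -- the refresh propagates it:
(repo / "build_mlp.py").write_text(LOCAL_LINE)
refresh_cache()
assert (cache / "build_mlp.py").read_text() == LOCAL_LINE
```

This matches the earlier advice in the thread: patch build_mlp.py inside the downloaded xcomposer2-vl-7b directory and pass that directory to from_pretrained, rather than editing anything under .cache.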