Model: https://huggingface.co/internlm/internlm-xcomposer2-4khd-7b

Reproduce code:

```python
import torch
from transformers import AutoModel, AutoTokenizer

torch.set_grad_enabled(False)

# init model and tokenizer
model = AutoModel.from_pretrained('internlm/internlm-xcomposer2-4khd-7b', torch_dtype=torch.bfloat16, trust_remote_code=True).cuda().eval()
```

Error:

```
OSError: We couldn't connect to 'https://hf-mirror.com' to load this file, couldn't find it in the cached files and it looks like internlm/internlm-xcomposer2-4khd-7b is not the path to a directory containing a file named configuration_internlm_xcomposer2.py.
```
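Note that the error references 'https://hf-mirror.com' rather than the default 'https://huggingface.co', which suggests the `HF_ENDPOINT` environment variable is set to a mirror that could not be reached. As a minimal sketch (an assumption on my part, not part of the original report), the endpoint override can be inspected before loading:

```python
import os

# If HF_ENDPOINT is set, transformers downloads remote-code files from that
# mirror; an unreachable mirror produces the OSError above.
endpoint = os.environ.get("HF_ENDPOINT", "not set (default: https://huggingface.co)")
print(endpoint)
```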
Thanks for your good work!