THUDM / VisualGLM-6B

Chinese and English multimodal conversational language model
Apache License 2.0

Error when loading local model: ValueError: Unrecognized configuration class <class 'transformers_modules.configuration_chatglm.ChatGLMConfig'> for this kind of AutoModel: AutoModelForCausalLM. #307

Closed: Mingyu-Wei closed this issue 10 months ago

Mingyu-Wei commented 10 months ago

I downloaded the model locally, and when I run this line, the error occurs.

model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)

I've looked at the solutions in https://github.com/THUDM/ChatGLM-6B/issues/153 and https://github.com/THUDM/ChatGLM-6B/issues/37, but they don't seem to help.

I understand that the problem might go away if I load the model directly from Hugging Face, but I think it is still worth reporting here.

1049451037 commented 10 months ago

I think you should upgrade your transformers package.

pip install git+https://github.com/huggingface/transformers

Mingyu-Wei commented 10 months ago

I think you should upgrade your transformers package.

pip install git+https://github.com/huggingface/transformers

I've tried this, and I'm currently using transformers 4.35.0.dev. The problem still exists. Which version should I use?
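
(For what it's worth, the version actually loaded at runtime can be confirmed with:

python -c "import transformers; print(transformers.__version__)"

in case an older install in another environment is shadowing the upgraded one.)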

1049451037 commented 10 months ago

Use AutoModel instead of AutoModelForCausalLM if you are using VisualGLM-6B.
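
For reference, a minimal loading sketch along the lines of the repository README, assuming model_path points at the local VisualGLM-6B checkpoint directory and a CUDA GPU is available:

from transformers import AutoTokenizer, AutoModel

model_path = "./visualglm-6b"  # assumed local checkpoint directory
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# AutoModel dispatches to the custom ChatGLM code shipped with the checkpoint;
# AutoModelForCausalLM has no mapping for ChatGLMConfig, hence the ValueError.
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()

# Example chat call, as in the README:
image_path = "example.jpg"  # path to a local image (assumed)
response, history = model.chat(tokenizer, image_path, "Describe this image.", history=[])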

Mingyu-Wei commented 10 months ago

Use AutoModel instead of AutoModelForCausalLM if you are using VisualGLM-6B.

Thanks, that was indeed the problem.