ValueError: The checkpoint you are trying to load has model type `llava_qwen` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. #213
When I run the following code:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("lmms-lab/llava-next-interleave-qwen-7b")
```

I get this error:

```
ValueError: The checkpoint you are trying to load has model type `llava_qwen` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
My transformers version is 4.40.0.dev0.