Open Jun-Howie opened 1 month ago
What is the version of your transformers?
transformers 4.45.0.dev0
Same question; transformers 4.45.0.dev0, accelerate 0.33.0.
When I tried the code snippet from the top right of the Qwen/Qwen2-VL-7B-Instruct Hugging Face page:
# Load model directly
from transformers import AutoProcessor, AutoModelForSeq2SeqLM
processor = AutoProcessor.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
model = AutoModelForSeq2SeqLM.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
I had a similar error for AutoModelForSeq2SeqLM: ValueError: Unrecognized configuration class <class 'transformers.models.qwen2_vl.configuration_qwen2_vl.Qwen2VLConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM. Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, Qwen2AudioConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig.
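The error comes from how the Auto classes resolve a config: each AutoModel* class keeps its own mapping from config class to model class, and Qwen2VLConfig is simply not in the AutoModelForSeq2SeqLM (or AutoModelForCausalLM) mapping. A minimal, purely illustrative mock of that lookup (this is NOT the real transformers implementation, and the Fake* names are made up):

```python
# Minimal mock of the Auto-class lookup that raises this ValueError.
# Each Auto class holds a mapping from config class -> model class;
# a config absent from the mapping is rejected.

class FakeQwen2VLConfig:
    pass

class FakeAutoModelForSeq2SeqLM:
    _model_mapping = {}  # FakeQwen2VLConfig is deliberately absent

    @classmethod
    def from_config(cls, config):
        model_cls = cls._model_mapping.get(type(config))
        if model_cls is None:
            raise ValueError(
                f"Unrecognized configuration class {type(config)} "
                f"for this kind of AutoModel: {cls.__name__}."
            )
        return model_cls()

try:
    FakeAutoModelForSeq2SeqLM.from_config(FakeQwen2VLConfig())
except ValueError as err:
    print(f"raised: {err}")
```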
My package versions: transformers 4.45.0.dev0, accelerate 0.27.2
Qwen2VLForConditionalGeneration can't be loaded from AutoModelForCausalLM or AutoModelForSeq2SeqLM. Instead, it's registered under AutoModelForVision2Seq, following the transformers conventions. You can use AutoModelForVision2Seq as follows:
from transformers import AutoModelForVision2Seq
model = AutoModelForVision2Seq.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
Alternatively, you can directly use:
from transformers import Qwen2VLForConditionalGeneration
model = Qwen2VLForConditionalGeneration.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
@Jun-Howie It seems that the error in your code occurs in AutoModelForCausalLM.from_pretrained, rather than when loading the model via Qwen2VLForConditionalGeneration. If your framework always loads models through AutoModelForCausalLM (which is technically incorrect) and you cannot modify the framework's code, you can temporarily register Qwen2VLForConditionalGeneration under AutoModelForCausalLM:
# register before xxxx.from_pretrained
from transformers import AutoModelForCausalLM, Qwen2VLConfig, Qwen2VLForConditionalGeneration
AutoModelForCausalLM.register(config_class=Qwen2VLConfig, model_class=Qwen2VLForConditionalGeneration)
# now you can load Qwen2VLForConditionalGeneration from AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
However, please note that this approach does not conform to standard practices and should only be used temporarily.
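The mechanism behind this workaround can be sketched with a toy registry (again, not the real transformers code; the Toy* names are hypothetical): registering a config→model pair makes the Auto class accept a config it would otherwise reject.

```python
# Toy sketch of the register() pattern: an Auto-style class with a
# config -> model mapping, where register() adds a new pair.

class ToyVLConfig:
    pass

class ToyVLModel:
    pass

class ToyAutoModelForCausalLM:
    _model_mapping = {}

    @classmethod
    def register(cls, config_class, model_class):
        cls._model_mapping[config_class] = model_class

    @classmethod
    def from_config(cls, config):
        model_cls = cls._model_mapping.get(type(config))
        if model_cls is None:
            raise ValueError(f"Unrecognized configuration class {type(config)}")
        return model_cls()

# Before registration the config would be rejected; after, it resolves.
ToyAutoModelForCausalLM.register(ToyVLConfig, ToyVLModel)
model = ToyAutoModelForCausalLM.from_config(ToyVLConfig())
print(type(model).__name__)
```

This mirrors why the real `AutoModelForCausalLM.register(...)` call above works, and also why it should remain a temporary hack rather than a permanent fix.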
The example here is wrong. (I am not sure if this is a bug on the transformers website.)
I have already built transformers from source.
My inference code is as follows:
def load(self):
    from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
    from transformers.generation import GenerationConfig
The error message is as follows:
File "/miniconda/envs/comfyui/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 567, in from_pretrained raise ValueError( ValueError: [address=0.0.0.0:33177, pid=22511] Unrecognized configuration class <class 'transformers.models.qwen2_vl.configuration_qwen2_vl.Qwen2VLConfig'> for this kind of AutoModel: AutoModelForCausalLM. Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DbrxConfig, ElectraConfig, ErnieConfig, FalconConfig, FalconMambaConfig, FuyuConfig, GemmaConfig, Gemma2Config, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GraniteConfig, JambaConfig, JetMoeConfig, LlamaConfig, MambaConfig, Mamba2Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MistralConfig, MixtralConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, NemotronConfig, OlmoConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, Phi3Config, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, RecurrentGemmaConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.