Harini-Vemula-2382 opened this issue 5 months ago
@Harini-Vemula-2382 Thank you. Llava ONNX export is not yet supported. A PR is open: https://github.com/huggingface/optimum/pull/1790
The error you get is likely:
File "/home/felix/transformers/src/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, FalconConfig, FuyuConfig, GemmaConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MambaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MistralConfig, MixtralConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.
which stems from a wrong task label on the Hugging Face Hub: https://huggingface.co/liuhaotian/llava-v1.6-34b/discussions/11 & https://huggingface.co/datasets/huggingface/transformers-metadata/blob/main/pipeline_tags.json#L440-L441
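To illustrate why a mislabeled task produces this ValueError, here is a minimal, self-contained sketch of how an AutoModel-style factory dispatches on the configuration class. The registry and function below are toy stand-ins for illustration only, not transformers' actual mapping or API:

```python
# Toy sketch: each Auto class keeps a registry of supported config classes
# and rejects anything outside it. This is a small illustrative subset.
SUPPORTED_CAUSAL_LM_CONFIGS = {"LlamaConfig", "GPT2Config", "MistralConfig"}

def from_pretrained_causal_lm(config_class_name: str) -> str:
    """Mimics the config check done in AutoModelForCausalLM.from_pretrained."""
    if config_class_name not in SUPPORTED_CAUSAL_LM_CONFIGS:
        raise ValueError(
            f"Unrecognized configuration class {config_class_name} for this "
            f"kind of AutoModel: AutoModelForCausalLM."
        )
    return f"loaded model for {config_class_name}"

# A Hub task label of `text-generation` makes the exporter pick the
# causal-LM factory, which does not know LlavaConfig:
print(from_pretrained_causal_lm("LlamaConfig"))  # dispatch succeeds
try:
    from_pretrained_causal_lm("LlavaConfig")     # fails like the traceback above
except ValueError as exc:
    print("ValueError:", exc)
```

The point is that the failure happens at dispatch time, before any export logic runs: fixing the task metadata (or overriding the task) changes which factory is consulted.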
System Info
Who can help?
@michaelbenayoun @JingyaHuang @echarlaix

I am writing to report an issue I encountered while attempting to export a Llava-1.5-7b model to ONNX format using Optimum.
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction (minimal, reproducible, runnable)
optimum-cli export onnx --model liuhaotian/llava-v1.5-7b llava_optimum_onnx/ --trust-remote-code
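As a side note, `optimum-cli export onnx` accepts an explicit `--task` argument, which overrides the task inferred from the Hub metadata. Since Llava export is not yet merged (see the PR linked above), the command below is still expected to fail, but it avoids routing through the mislabeled `text-generation` task; treating `image-to-text` as the intended task for this checkpoint is an assumption here:

```shell
# Override the task inferred from the Hub's pipeline_tags metadata.
# Llava ONNX export itself is still pending, so this is expected to error,
# but no longer via the AutoModelForCausalLM dispatch.
optimum-cli export onnx \
  --model liuhaotian/llava-v1.5-7b \
  --task image-to-text \
  llava_optimum_onnx/ \
  --trust-remote-code
```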
Expected behavior
I would expect Optimum to export the Llava-1.5-7b model to ONNX format without errors.