Open QiaoZhennn opened 6 months ago
My transformers version is 4.39.2.
Hi,
The model weights we've uploaded are formatted as a transformers PEFT LoRA adapter, so they don't support direct loading with this transformers auto-loading code yet. To load our model, you should check out this function in our code for reference; using it, you should be able to load the model as a PeftLanguageModel.
By the way, if you wish to run the demo, you can execute this script.
If you want to evaluate our model directly, you can follow the instructions here to prepare the data, then execute this script.
I tried to load the model using this demo code, but it shows the following error. Is there an example of how to run inference using Hugging Face?
```
Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM. Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig.
```