I tried to load the model with transformers.AutoModel.from_pretrained, but I got this error:
Traceback (most recent call last):
  File "/home/wqruan/miniconda3/envs/llava-med/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/wqruan/miniconda3/envs/llava-med/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/wqruan/vlm/train.py", line 24, in <module>
    model = AutoModel.from_pretrained(model_path)
  File "/home/wqruan/miniconda3/envs/llava-med/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/wqruan/miniconda3/envs/llava-med/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1098, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/wqruan/miniconda3/envs/llava-med/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 795, in __getitem__
    raise KeyError(key)
KeyError: 'llava_mistral'
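For reference, this is essentially all train.py does up to the failure, wrapped in a try/except here just to show where it breaks. `model_path` is a hypothetical placeholder for my local checkpoint directory, not the real path:

```python
# Minimal repro sketch. "path/to/llava-med-checkpoint" is a hypothetical
# placeholder for the local checkpoint directory, not the actual path.
model_path = "path/to/llava-med-checkpoint"

try:
    from transformers import AutoModel

    # With the real checkpoint, this raises KeyError('llava_mistral'):
    # the checkpoint's config.json declares model_type "llava_mistral",
    # which this transformers install has no entry for in CONFIG_MAPPING.
    model = AutoModel.from_pretrained(model_path)
except Exception as err:
    print(f"load failed: {err!r}")
```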
Can anyone help? Thanks.