EvolvingLMMs-Lab / lmms-eval

Accelerating the development of large multimodal models (LMMs) with lmms-eval
https://lmms-lab.github.io/

"llava_hf" is not registered #78

Open kyleliang919 opened 6 months ago

kyleliang919 commented 6 months ago

When running with --model llava_hf, the model doesn't seem to be properly registered or recognized as a model type:

Attempted to load model 'llava_hf', but no model for this name found! Supported model names: llava, llava_sglang, qwen_vl, fuyu, gpt4v, instructblip, minicpm_v

My lmms_eval version is as follows:

Name: lmms_eval
Version: 0.1.2
Summary: A framework for evaluating large multi-modality language models
Home-page:
Author:
Author-email: LMMs-Lab Evaluation Team <lmms_eval@outlook.com>
License: MIT

YitaoLiu1996 commented 6 months ago

Have you checked the AVAILABLE_MODELS dict in lmms_eval/models/__init__.py?
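
For context, the registration there is a name-to-class mapping whose per-model imports are wrapped in a try/except, so a model with a broken dependency silently drops out of the supported list. A minimal sketch of that pattern (illustrative only, not the exact file contents, which vary by version):

```python
# Illustrative sketch of the registration pattern in lmms_eval/models/__init__.py;
# exact entries and class names vary by version.
import importlib
import logging

AVAILABLE_MODELS = {
    "llava": "Llava",
    "llava_hf": "LlavaHf",  # the model reported missing in this issue
    # ... other models ...
}

for name, class_name in AVAILABLE_MODELS.items():
    try:
        importlib.import_module(f"lmms_eval.models.{name}")
    except ImportError as e:
        # A swallowed ImportError here is why a broken dependency makes the
        # model "disappear" from the supported list instead of raising.
        logging.debug(f"Skipping {name}: {e}")
```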

kyleliang919 commented 6 months ago

Yes, llava_hf is listed in __init__.py. I didn't modify that part of the code.

kcz358 commented 5 months ago

You might want to print out the error when trying to import llava_hf. It is likely an ImportError. I can't reproduce this error using the main branch.
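
One way to surface that hidden error is to import the module directly and print the full traceback; a minimal sketch, assuming the module path implied by the package layout above:

```python
# Import llava_hf directly so the ImportError that the registry swallows
# gets printed in full. (Module path assumed; adjust if your layout differs.)
import traceback

try:
    import lmms_eval.models.llava_hf  # noqa: F401
    print("llava_hf imported fine")
except Exception:
    traceback.print_exc()
```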

kangreen0210 commented 4 months ago

Perhaps this is due to a package import error, such as a mismatched Transformers version, which prevents llava_hf.py from being imported properly. Matching the Transformers version works for me.

codefanw commented 4 months ago

I encountered the same issue and resolved it by changing the version of the transformers package.
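
For anyone hitting the same wall, llava_hf relies on the Hugging Face LLaVA classes that only exist in newer transformers releases, so a quick sanity check is to confirm they import. A minimal sketch (the "new enough" threshold is an assumption; check lmms-eval's requirements for the exact version it pins):

```python
# Check whether the installed transformers version ships the HF LLaVA classes
# that llava_hf needs. (The required minimum version is an assumption; consult
# lmms-eval's requirements for the exact pin.)
import transformers

print("transformers version:", transformers.__version__)

try:
    from transformers import LlavaForConditionalGeneration, AutoProcessor  # noqa: F401
    print("LLaVA classes available -- llava_hf should register")
except ImportError as e:
    print("transformers too old for llava_hf, upgrade it:", e)
```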