Closed — hulianyuyy closed this issue 2 months ago
Hi, can you try to install llava first and see whether the error occurs again?
Yes, I have installed llava-1.7.0-dev0 and lmms-eval==0.2.1, and I still see the problem. Here is the information from running pip list.
Can you try `from lmms_eval.models.llava_onevision import LlavaOneVision` and see what import error occurs?
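One way to surface the underlying import error with its full traceback is a small try/except wrapper. This is a sketch of my own (the helper name `check_import` is not part of lmms-eval); the stdlib calls at the bottom are stand-ins so it runs anywhere:

```python
import importlib
import traceback

def check_import(module_path, attr=None):
    """Try to import module_path (and optionally one attribute from it),
    printing the full traceback on failure so the root cause is visible."""
    try:
        mod = importlib.import_module(module_path)
        if attr is not None:
            getattr(mod, attr)
        print("OK:", module_path if attr is None else f"{module_path}.{attr}")
        return True
    except Exception:
        traceback.print_exc()
        return False

# Stdlib examples so the sketch runs in any environment:
check_import("json", "dumps")         # succeeds
check_import("no_such_module_xyz")    # prints the traceback, returns False

# In the lmms-eval environment you would run:
# check_import("lmms_eval.models.llava_onevision", "LlavaOneVision")
```

The printed traceback usually points at the real culprit (e.g. a missing transformers symbol) rather than the top-level module name.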
If it is not in the models folder, then you might need to build from source
Thank you for your response. The following error happens. lmms-eval requires transformers==4.39.2, which already contains Qwen2moe_config, but it reports an error that it can't import Qwen2moe_config. It seems that I need to install a higher version of transformers and reinstall lmms-eval from source.
I have reinstalled llava and lmms-eval from source and updated transformers to 4.44.0, but I still see the following error.
This is how I prepare the envs:
cd /path/to/lmms-eval
python3 -m pip install -e .;
cd /path/to/LLaVA-NeXT;
python3 -m pip install -e ".[train]";
# This could be optional
python3 -m pip install httpx==0.23.3 protobuf==3.20;
I checked that this can run the llava onevision model with the current lmms-eval and LLaVA-NeXT code. Can you try pulling both repos again and reinstalling?
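After the two editable installs, a quick check from the same interpreter can confirm the packages are actually visible to it (a sketch; the distribution names `lmms_eval`, `transformers`, and `llava` are assumed to match your local checkouts):

```python
from importlib.metadata import version, PackageNotFoundError

# Print the installed version of each package, or flag it as missing.
# A package that is "NOT INSTALLED" here will not be importable either.
for pkg in ("lmms_eval", "transformers", "llava"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "NOT INSTALLED in this interpreter")
```

Running this inside the same Python that launches the eval rules out the common pitfall of installing into one environment and running from another.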
If you are encountering this error:
Thank you for your response. The following error happens. lmms-eval requires transformers==4.39.2, which already contains Qwen2moe_config, but it reports an error that it can't import Qwen2moe_config. It seems that I need to install a higher version of transformers and reinstall lmms-eval from source.
Please go to Hugging Face and request access to the gated Llama models. You might also need to export your Hugging Face token.
Thanks. I have repulled the code and reinstalled the environment following your instructions. When I directly try `from lmms_eval.models.llava_onevision import LlavaOneVision`, the llava_onevision model seems to exist, and it asks me to provide a Hugging Face token. But when I try evaluating the llava_onevision model following the instructions, it shows the following errors. It seems the llava_onevision model is not registered in the repo.
Please make sure that `from lmms_eval.models.llava_onevision import LlavaOneVision` can be executed without error; otherwise this model will be skipped and will not be added to the registry. `export HF_TOKEN=xxx` exports your Hugging Face token into the environment. Make sure you have been granted access to the gated models.
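To confirm the token actually reached the process, a minimal check from inside Python (HF_TOKEN is the variable named above; the token must be visible to the process that runs the eval, not just the shell where you typed `export`):

```python
import os

# Look up the Hugging Face token in this process's environment.
token = os.environ.get("HF_TOKEN")
if token:
    print("HF_TOKEN is set,", len(token), "characters")
else:
    print("HF_TOKEN is not set; downloads of gated checkpoints will fail")
```

If this prints "not set" while `echo $HF_TOKEN` works in your shell, the export likely happened in a different shell or after the process started.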
Thanks for your response. I have exported my Hugging Face token. Directly running `from lmms_eval.models.llava_onevision import LlavaOneVision` raises the following error.
Llava and lmms-eval are built from source with the following versions.
Hi, I have noticed that there is a typo in the init file. Can you repull the repo and try again?
Thanks, the model can now be loaded by running `from lmms_eval.models.llava_onevision import Llava_OneVision`.
I have run
pip install git+https://github.com/EvolvingLMMs-Lab/lmms-eval.git
and successfully installed lmms-eval==0.2.1. But I got the following error: ValueError: Attempted to load model 'llava_onevision', but no model for this name found! Supported model names: batch_gpt4, claude, from_log, fuyu, gemini_api, gpt4v, instructblip, internvl, internvl2, llama_vid, llava, llava_hf, llava_sglang, longva, mantis, minicpm_v, phi3v, qwen_vl, qwen-vl-api, reka, srt_api, tinyllava, xcomposer2_4khd, xcomposer2d5
It can't find the llava_onevision model.
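The "no model for this name found" error is consistent with a registry that silently drops any model whose import fails, which is why the maintainers keep asking to check the bare import first. A hypothetical sketch of that pattern (the dict name and entries here are illustrative, not lmms-eval's actual code; `json_demo` is a stdlib stand-in so the sketch runs anywhere):

```python
import importlib

# Illustrative registry: model name -> (module path, class/attr name).
AVAILABLE_MODELS = {
    "llava_onevision": ("lmms_eval.models.llava_onevision", "LlavaOneVision"),
    "json_demo": ("json", "dumps"),  # stdlib stand-in that always imports
}

REGISTRY = {}
for name, (module_path, attr) in AVAILABLE_MODELS.items():
    try:
        REGISTRY[name] = getattr(importlib.import_module(module_path), attr)
    except Exception as exc:
        # The silent skip: the name never lands in REGISTRY, so a later
        # lookup fails with "no model for this name found".
        print(f"skipped {name}: {exc}")

print("registered:", sorted(REGISTRY))
```

Under this pattern, the supported-model-names list in the ValueError is simply the set of entries whose imports succeeded, so a missing name means a broken import rather than a missing feature.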