OpenBMB / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Bug]: `python minicpmv_example.py` fails with: File "/data/envs/jh_llava/lib/python3.10/site-packages/vllm/model_executor/model_loader/utils.py", line 35, in get_model_architecture ValueError: Model architectures ['MiniCPMV'] are not supported for now. Supported architectures: — Is this an environment version problem? If so, could you provide the correct versions? #1

Open zjh908491372 opened 2 months ago

zjh908491372 commented 2 months ago

Your current environment

The output of `python collect_env.py`

🐛 Describe the bug

The output of `python minicpmv_example.py`

File "/data/envs/jh_llava/lib/python3.10/site-packages/vllm/model_executor/model_loader/utils.py", line 35, in get_model_architecture
ValueError: Model architectures ['MiniCPMV'] are not supported for now. Supported architectures:

Is this an environment version problem? If so, could you provide the correct versions?

128Ghe980 commented 1 month ago

same problem

HwwwwwwwH commented 1 month ago

At that time, you may have needed to check out the minicpmv branch. As of now, our code has been merged into the official vLLM repo. You can clone their code (I've also updated this repo) with:

git clone git@github.com:vllm-project/vllm.git

or just pull the latest commit on this repo's main branch.
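For completeness, a minimal sketch of installing vLLM from source after cloning, so that the MiniCPMV architecture is registered (the `pip install -e .` step and the `ModelRegistry` check are assumptions about a recent vLLM version and may differ across releases):

```shell
# Clone the official vLLM repo (HTTPS variant of the SSH URL above)
git clone https://github.com/vllm-project/vllm.git
cd vllm

# Build and install from source in editable mode
# (assumes build dependencies such as a compatible CUDA toolchain are present)
pip install -e .

# Sanity check: confirm 'MiniCPMV' now appears among the registered
# architectures (API name assumed; it may vary by vLLM version)
python -c "from vllm import ModelRegistry; print('MiniCPMV' in ModelRegistry.get_supported_archs())"
```

If the final check prints `False`, the installed vLLM version likely predates the MiniCPMV merge, and updating to a newer commit should resolve the original `ValueError`.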