abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Add support for Qwen2-VL #1811

Open PredyDaddy opened 3 weeks ago

PredyDaddy commented 3 weeks ago

Is your feature request related to a problem? Please describe.
I am currently using Qwen2-VL; it is the best VLM for my project, and I hope llama-cpp-python can support it. I tried to use llama.cpp to build a server directly, but the llama.cpp server does not allow using the mm-proj file.

Describe the solution you'd like
Support the Qwen2-VL model so it can be used like the other VLM models.
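
For context, existing VLMs in llama-cpp-python are used through a chat handler that loads the model's multimodal projector (mmproj) file, e.g. `Llava15ChatHandler`. The sketch below shows that existing LLaVA-style usage; a Qwen2-VL handler would presumably follow the same pattern, but any Qwen2-VL-specific handler name or file path is an assumption, not an existing API.

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# Existing pattern for vision models: the chat handler loads the mmproj
# (CLIP projector) file, and Llama routes image inputs through it.
chat_handler = Llava15ChatHandler(clip_model_path="mmproj-model-f16.gguf")
llm = Llama(
    model_path="llava-v1.5-7b.Q4_K_M.gguf",  # example LLaVA weights
    chat_handler=chat_handler,
    n_ctx=4096,  # larger context to leave room for image embeddings
)

# OpenAI-style multimodal message with an image_url part plus a text part.
response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```

The request here is for an equivalent handler for Qwen2-VL, so its GGUF weights plus mmproj file could be loaded and queried the same way.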

zhouxihong1 commented 6 days ago

+1

fanfansoft commented 3 days ago

+1, please