OpenBMB / MiniCPM

MiniCPM-2B: An end-side LLM outperforming Llama2-13B.
Apache License 2.0

[Feature Request]: Running multimodal on llama.cpp? #129

Open yyyoungman opened 2 months ago

yyyoungman commented 2 months ago

Feature request

It seems only the text model is supported on llama.cpp; the multimodal part is not. It would be really helpful if it could be supported, because llama.cpp is very widely used now (far more than MLC) and is easy to deploy on edge devices. Support would also greatly increase the impact of this work. Thanks!

LDLINGLINGLING commented 1 week ago

We forked llama.cpp: https://github.com/OpenBMB/llama.cpp/tree/minicpm-v2.5/examples/minicpmv. There you can learn how to run our multimodal model with llama.cpp. Hope this is useful to you.
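
For anyone who lands here later, below is a minimal sketch of how the fork's example binary might be driven from a Python script. It assumes the fork builds a llava-style `minicpmv-cli` tool that accepts `-m`, `--mmproj`, `--image`, and `-p` options; the exact binary name, flags, and file paths are assumptions, so check examples/minicpmv in the fork for the authoritative usage.

```python
# Sketch only: invoke the multimodal example from the OpenBMB llama.cpp fork.
# "minicpmv-cli" and the llava-style flags below are assumptions based on
# llama.cpp's other multimodal examples; all paths are placeholders.
import subprocess

result = subprocess.run(
    [
        "./minicpmv-cli",                     # binary built from the OpenBMB fork (assumed name)
        "-m", "ggml-model-Q4_K_M.gguf",       # quantized language-model weights (placeholder path)
        "--mmproj", "mmproj-model-f16.gguf",  # vision projector weights (placeholder path)
        "--image", "demo.jpg",                # input image (placeholder path)
        "-p", "What is in this image?",       # text prompt
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
```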