OpenBMB / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Doc]: minicpmv2 inference #7

Open · xiaohuihui52309 opened this issue 2 months ago

xiaohuihui52309 commented 2 months ago

📚 The doc issue

How can I use vLLM to run inference with a fine-tuned MiniCPM-V 2 (minicpmv2) model?
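For context, a minimal sketch of how loading a locally fine-tuned checkpoint with vLLM's offline `LLM` API might look. The model path `path/to/finetuned-minicpmv2`, the prompt, and the sampling settings below are placeholder assumptions, not details from this issue; whether a given fine-tuned MiniCPM-V 2 checkpoint loads depends on its config and on the vLLM version's support for the architecture.

```python
# Hedged sketch, not a confirmed recipe: point vLLM at a local directory
# containing the fine-tuned weights. "path/to/finetuned-minicpmv2" is a
# placeholder path.
from vllm import LLM, SamplingParams

llm = LLM(
    model="path/to/finetuned-minicpmv2",  # local fine-tuned checkpoint (placeholder)
    trust_remote_code=True,               # MiniCPM-V ships custom modeling code
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Describe the picture."], params)
for out in outputs:
    print(out.outputs[0].text)
```

Passing image inputs (MiniCPM-V is multimodal) requires additional multimodal arguments whose exact form varies across vLLM versions, so consult the vLLM docs for the version in use.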

Suggest a potential alternative/fix

No response