OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone
Apache License 2.0

[BUG] Does MiniCPM-V 2.5 support the vLLM inference framework? #330

Closed weiminw closed 1 month ago

weiminw commented 2 months ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

No response

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

LDLINGLINGLING commented 2 months ago

This is still under development; stay tuned.

iceflame89 commented 2 months ago

@weiminw MiniCPM-Llama3-V 2.5 now supports vLLM; see https://github.com/OpenBMB/MiniCPM-V?tab=readme-ov-file#vllm
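
For reference, a minimal sketch of offline inference along these lines. This assumes a vLLM build with MiniCPM-V support (per the README link above), a local image file `example.jpg`, and a GPU; the prompt template and multimodal input format are assumptions that may differ between vLLM versions, so check the linked README for the exact usage.

```python
# Hedged sketch: single-image inference with MiniCPM-Llama3-V 2.5 under vLLM.
# Assumes a vLLM version that supports this model; the image placeholder
# syntax in the prompt is an assumption and may vary by version.
from PIL import Image
from vllm import LLM, SamplingParams

llm = LLM(
    model="openbmb/MiniCPM-Llama3-V-2_5",  # Hugging Face model id
    trust_remote_code=True,                # model repo ships custom code
    max_model_len=2048,
)

image = Image.open("example.jpg").convert("RGB")  # example.jpg is a placeholder path
prompt = "(<image>./</image>)\nWhat is in this picture?"

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": image}},
    SamplingParams(temperature=0.7, max_tokens=256),
)
print(outputs[0].outputs[0].text)
```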