OpenBMB / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Usage]: Is batch inference supported? #3

Open pandaGst opened 4 months ago

pandaGst commented 4 months ago

Does minicpm_example.py support passing in [prompt, ..., prompt] and [image, image, image] to run batch inference? The current default script appears to feed requests in one at a time, finishing one inference before processing the next.

HwwwwwwwH commented 4 months ago

Not supported yet, but batch inference should be added in an upcoming update.
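
For reference, here is a minimal sketch of how batched multimodal inference is expressed in upstream vLLM once a model supports it; `llm.generate` accepts a list of requests and schedules them together via continuous batching. The model name, prompt strings, and image paths below are assumptions for illustration, and the exact prompt format for MiniCPM-V in this fork may differ from what minicpm_example.py uses:

```python
# Sketch of batched multimodal inference with upstream vLLM's API.
# Assumptions: model name, prompts, and image paths are placeholders;
# the fork's minicpm_example.py may expect a different prompt template.
from PIL import Image
from vllm import LLM, SamplingParams

llm = LLM(model="openbmb/MiniCPM-V-2", trust_remote_code=True)
sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

prompts = ["Describe this image.", "What objects are present?"]
images = [Image.open("a.jpg"), Image.open("b.jpg")]

# Build one request per (prompt, image) pair; vLLM batches these
# internally instead of running them strictly one after another.
requests = [
    {"prompt": p, "multi_modal_data": {"image": img}}
    for p, img in zip(prompts, images)
]

outputs = llm.generate(requests, sampling_params)
for out in outputs:
    print(out.outputs[0].text)
```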