OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone
Apache License 2.0
12.78k stars 893 forks

batch inference #228

Closed berry-ding closed 5 months ago

berry-ding commented 6 months ago

Hi, great work! How can I implement batch inference in an offline environment?

BoyuGuan commented 6 months ago

I'd also like to know, since I need to obtain inference results from MiniCPM-Llama3-V-2_5 on my own dataset.

Cuiunbo commented 5 months ago

Hi, thanks a lot for your support. We are not planning to support batch input at the moment, because a batch size of 1 can already saturate the available resources most of the time. If you need batch inference, you are welcome to implement it yourself.
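
For anyone landing here: since bs=1 already fills the GPU, the practical workaround is a plain sequential loop over the dataset. Below is a minimal sketch; `run_dataset` and `infer_one` are hypothetical helper names (not part of the repo), and the commented-out wiring to `model.chat` is based on the usage shown in the MiniCPM-V README — double-check it against the current README before use.

```python
def run_dataset(samples, infer_one):
    """Pseudo-batch inference: run one sample at a time and collect results.

    samples: iterable of (image, question) pairs.
    infer_one: callable mapping (image, question) -> answer string.
    """
    results = []
    for image, question in samples:
        results.append(infer_one(image, question))
    return results

# Example wiring (requires the actual model weights; illustration only,
# adapted from the repo README):
#
# from transformers import AutoModel, AutoTokenizer
# model = AutoModel.from_pretrained("openbmb/MiniCPM-Llama3-V-2_5",
#                                   trust_remote_code=True).eval().cuda()
# tokenizer = AutoTokenizer.from_pretrained("openbmb/MiniCPM-Llama3-V-2_5",
#                                           trust_remote_code=True)
#
# def infer_one(image, question):
#     msgs = [{"role": "user", "content": question}]
#     return model.chat(image=image, msgs=msgs, tokenizer=tokenizer,
#                       sampling=False)
```

For an offline environment, download the model files once and point `from_pretrained` at the local directory instead of the hub name.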