OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone
Apache License 2.0

batch inference #228

Closed · berry-ding closed this issue 3 weeks ago

berry-ding commented 1 month ago

Hi, great work! How can I implement batch inference in an offline environment?

BoyuGuan commented 1 month ago

I also want to know, since I need to obtain inference results from MiniCPM-Llama3-V-2_5 on my own dataset.

Cuiunbo commented 3 weeks ago

Hi, thanks a lot for your support. We are not planning to support batch input at the moment, because batch size 1 already saturates the GPU most of the time. If you need batch inference, you are welcome to implement it yourself.
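For anyone landing here: since the maintainers suggest rolling your own, a minimal workaround is a sequential per-sample loop over a dataset using the released bs=1 `chat` API. The sketch below is not from the maintainers; the loading and `model.chat` lines follow the usage shown in the project README, while `run_dataset`, `infer`, and `my_samples` are illustrative names invented for this example.

```python
def run_dataset(infer_fn, samples):
    """Run a bs=1 inference callable over a list of samples, collecting results.

    This is a plain sequential loop, not true batched inference; it simply
    automates repeated single-sample calls.
    """
    results = []
    for sample in samples:
        results.append(infer_fn(sample))
    return results


# Usage against the real model would look roughly like this (untested sketch,
# following the README's example; requires a GPU and the model weights):
#
# import torch
# from PIL import Image
# from transformers import AutoModel, AutoTokenizer
#
# model = AutoModel.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5',
#                                   trust_remote_code=True,
#                                   torch_dtype=torch.float16).to('cuda').eval()
# tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5',
#                                           trust_remote_code=True)
#
# def infer(sample):
#     # `sample` here is assumed to be a dict with 'image_path' and 'question'.
#     image = Image.open(sample['image_path']).convert('RGB')
#     msgs = [{'role': 'user', 'content': sample['question']}]
#     return model.chat(image=image, msgs=msgs, tokenizer=tokenizer,
#                       sampling=False)
#
# answers = run_dataset(infer, my_samples)
```

This will not be faster than bs=1 (that is the maintainers' point), but it does give you one result per dataset item without manual intervention.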