BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0

Batch inference #93

Open mtsysin opened 1 month ago

mtsysin commented 1 month ago

Hi!

I'm evaluating the model on a relatively large dataset (single question, single answer). I was able to fine-tune the Bunny-1.1-Llama-3-8B-V model using one of the scripts provided. What is the best strategy to implement batch inference?

Isaachhh commented 1 week ago

Sorry, we don't support batch inference currently. You may split the dataset into multiple parts and launch a model instance on each GPU, similar to how we evaluate on VQA, GQA and SEED-Bench.
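
For reference, here is a minimal sketch of that strategy: shard a JSONL question file into one part per GPU, run a single-GPU inference process on each shard, and merge the answers afterwards. The target script name (`eval_single.py`) and its `--question-file` / `--answers-file` flags are hypothetical placeholders, not Bunny's actual CLI; substitute the evaluation script you already use for single-example inference.

```python
# shard_and_launch.py -- hedged sketch: split a JSONL question file across GPUs
# and run one single-GPU inference process per shard. The eval script name and
# its flags are assumptions, not part of the Bunny codebase.
import os
import subprocess

QUESTION_FILE = "questions.jsonl"  # one question per line (assumption)
NUM_GPUS = 8

# Read all questions and split them into NUM_GPUS roughly equal shards.
with open(QUESTION_FILE) as f:
    lines = f.readlines()

procs = []
for gpu in range(NUM_GPUS):
    shard_path = f"questions_shard{gpu}.jsonl"
    with open(shard_path, "w") as f:
        f.writelines(lines[gpu::NUM_GPUS])

    # Pin each worker to a single GPU via CUDA_VISIBLE_DEVICES.
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))
    procs.append(subprocess.Popen(
        ["python", "eval_single.py",           # hypothetical single-GPU eval script
         "--question-file", shard_path,
         "--answers-file", f"answers_shard{gpu}.jsonl"],
        env=env,
    ))

# Wait for all workers, then concatenate the per-shard answers.
for p in procs:
    p.wait()

with open("answers.jsonl", "w") as out:
    for gpu in range(NUM_GPUS):
        with open(f"answers_shard{gpu}.jsonl") as f:
            out.write(f.read())
```

This gives near-linear throughput scaling with the number of GPUs without requiring padding or attention-mask changes in the model itself, which is why we use the same pattern for the VQA, GQA and SEED-Bench evaluations.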