OFA-Sys / OFA

Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
Apache License 2.0

Batch to Inference #440

Open GGital opened 4 months ago

GGital commented 4 months ago

Hello everyone, I am currently running image-captioning inference with OFA-Huge fine-tuned on COCO over a test set of about 48k images, but it is very slow because I am processing one image per batch (about 1 image/sec, which means roughly 13 hours for the whole dataset). Is there any way to do batch inference on my test set while still keeping beam-search generation? A rough idea of what I have in mind is sketched below.
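
For reference, this is a minimal sketch of the kind of batched loop I am hoping for, assuming the fairseq-style checkpoint interface this repo builds on. The checkpoint path, `my_caption_dataset`, `my_collate`, and the `patch_images`/`patch_masks` field names are my assumptions based on the single-image demo, not confirmed API of this repo:

```python
# Sketch: batched beam-search captioning via the fairseq-style interface.
# Assumptions: checkpoint path, dataset/collate helpers, and the exact
# "net_input" field names are placeholders taken from the single-image demo.
import torch
from fairseq import checkpoint_utils, utils
from torch.utils.data import DataLoader

models, cfg, task = checkpoint_utils.load_model_ensemble_and_task(
    ["checkpoints/caption_huge_best.pt"]  # assumed path to the fine-tuned checkpoint
)
models = [m.half().cuda().eval() for m in models]
generator = task.build_generator(models, cfg.generation)  # keeps the configured beam size

# my_caption_dataset / my_collate are hypothetical: they should yield preprocessed
# image tensors plus the (identical) caption-prompt tokens for each image.
loader = DataLoader(my_caption_dataset, batch_size=16, collate_fn=my_collate)

all_hypos = []
for patch_images, src_tokens, src_lengths in loader:
    sample = {
        "net_input": {
            "src_tokens": src_tokens,             # (B, T) prompt tokens
            "src_lengths": src_lengths,           # (B,)
            "patch_images": patch_images.half(),  # (B, 3, H, W) preprocessed images
            "patch_masks": torch.ones(patch_images.size(0), dtype=torch.bool),
        }
    }
    sample = utils.move_to_cuda(sample)
    with torch.no_grad():
        # inference_step runs beam search independently for each item in the batch
        hypos = task.inference_step(generator, models, sample)
    all_hypos.extend(h[0]["tokens"].cpu() for h in hypos)  # top beam per image
```

The top-beam token tensors would still need the same detokenization/post-processing as in the single-image demo. Does batching like this work with the caption task, or does the generator assume batch size 1 somewhere?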