I see that an ONNX model (for example, ViT converted to ONNX) has great potential if it can run inference on batched inputs, since batching reduces total time and boosts throughput.
Right now I can only run inference with batch_size=1; with batch_size > 1 I get an error. Would you mind helping me run inference with batch_size > 1?
Thanks for your excellent work.
Thanks, Tu