coqui-ai / TTS

🐸💬 - a deep learning toolkit for Text-to-Speech, battle-tested in research and production
http://coqui.ai
Mozilla Public License 2.0
33.25k stars 4.02k forks

[Feature request] Batch inference onnx Model #3814

Open phamkhactu opened 2 months ago

phamkhactu commented 2 months ago

Thanks for your excellent work.

I see that an ONNX model (for example, VITS converted to ONNX) could benefit from batched inference, since processing batched inputs reduces total runtime and boosts throughput.

Right now I can only run inference with batch_size=1; with any batch_size greater than 1, I get an error. Would you mind helping me run inference with batch_size > 1?

Thanks, Tu
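A common cause of errors at batch_size > 1 is feeding variable-length token sequences without padding them to a common length. As a minimal sketch (assuming the exported graph has a dynamic batch axis and takes a padded token tensor plus a lengths vector; the input names `x` and `x_lengths` below are assumptions, not confirmed from the TTS export code), the batching step could look like:

```python
import numpy as np

def pad_batch(sequences, pad_id=0):
    """Right-pad variable-length token sequences into one (B, T_max) array.

    Returns the padded batch and the original lengths, which VITS-style
    exports typically expect as a separate input (names are assumptions).
    """
    lengths = np.array([len(s) for s in sequences], dtype=np.int64)
    batch = np.full((len(sequences), lengths.max()), pad_id, dtype=np.int64)
    for i, seq in enumerate(sequences):
        batch[i, : len(seq)] = seq
    return batch, lengths

# Example: three phoneme-ID sequences of different lengths.
seqs = [[5, 9, 2], [7, 1], [3, 3, 3, 8]]
x, x_lengths = pad_batch(seqs)
print(x.shape)        # (3, 4)
print(x_lengths)      # [3 2 4]

# Hypothetical usage with onnxruntime (model path and input names assumed):
# import onnxruntime as ort
# sess = ort.InferenceSession("vits.onnx")
# audio = sess.run(None, {"x": x, "x_lengths": x_lengths})[0]
```

If the model was exported with a fixed batch dimension, it would also need to be re-exported with that axis marked dynamic before any padding scheme can help.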

TashaSkyUp commented 1 month ago

Yes, please add this.