jasonppy / VoiceCraft

Zero-Shot Speech Editing and Text-to-Speech in the Wild
7.52k stars · 739 forks

Batch Inference #143

Open nickmitchko opened 3 months ago

nickmitchko commented 3 months ago

Hi, is it possible to do batch inference the way LLMs do? For example, provide 10 transcripts and batch the requests to increase total throughput?

jasonppy commented 3 months ago

TTS inference already runs in batch mode, but right now it runs the same transcript multiple times and selects the shortest output. The code needs some tweaks to support batching distinct transcripts.
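The main tweak for batching distinct transcripts is that they have different lengths, so they must be padded to a common length and the pad positions masked out. A minimal sketch of that step (illustrative only; `pad_batch` and `pad_id` are hypothetical names, not VoiceCraft's actual API):

```python
def pad_batch(token_seqs, pad_id=0):
    """Right-pad variable-length token sequences to a common length
    and return a parallel 0/1 mask marking real (1) vs. padded (0) positions.
    """
    max_len = max(len(s) for s in token_seqs)
    batch = [s + [pad_id] * (max_len - len(s)) for s in token_seqs]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in token_seqs]
    return batch, mask

# Two transcripts of different lengths padded into one batch:
tokens, mask = pad_batch([[1, 2, 3], [4, 5]])
# tokens → [[1, 2, 3], [4, 5, 0]]
# mask   → [[1, 1, 1], [1, 1, 0]]
```

The model's attention would then consume the mask so padded positions don't influence generation, and each sequence's true length is used when trimming the outputs.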

thivux commented 1 month ago

Isn't inference_tts_scale.py the way to do batch inference?