NVIDIA / flowtron

Flowtron is an auto-regressive flow-based generative network for text to speech synthesis with control over speech variation and style transfer
https://nv-adlr.github.io/Flowtron
Apache License 2.0

Training on a smaller GPU? #119

Open brentcty-2020 opened 3 years ago

brentcty-2020 commented 3 years ago

Hi, is there any way to train this on a smaller GPU setup for us mere mortals? On an 8 GB 2070 with the batch size set to 1, it still runs out of memory.

Thanks for any help.

deepglugs commented 3 years ago

I have to add this to the training loop to get it to fit on my 24G card (dataset dependent):

# skip batches whose text is empty or longer than 550 tokens
if txt.size(1) <= 1 or txt.size(1) > 550:
    continue

This essentially skips training examples with empty or very long utterances.
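An alternative to skipping batches inside the loop is to filter the training filelist up front, so long entries never reach the data loader at all. Below is a minimal sketch, assuming the Flowtron-style `path|text|speaker_id` filelist format; the function name, the 550-token cutoff (mirroring the snippet above), and the length measure (characters) are illustrative assumptions you should adapt to your setup.

```python
def filter_filelist(lines, max_len=550, min_len=2):
    """Keep only filelist entries whose transcript length is within bounds.

    `lines` are strings in the Flowtron filelist style: "path|text|speaker_id".
    Entries with missing fields, empty text, or text longer than `max_len`
    characters are dropped.
    """
    kept = []
    for line in lines:
        parts = line.rstrip("\n").split("|")
        if len(parts) < 2:
            continue  # malformed entry, skip it
        text = parts[1]
        if min_len <= len(text) <= max_len:
            kept.append(line)
    return kept


if __name__ == "__main__":
    sample = [
        "wavs/a.wav|Hello there.|0",
        "wavs/b.wav|" + "x" * 600 + "|0",  # too long, dropped
        "wavs/c.wav||0",                   # empty text, dropped
    ]
    print(filter_filelist(sample))  # only the first entry survives
```

Filtering ahead of time also keeps your effective dataset size predictable, whereas skipping inside the loop silently shrinks each epoch.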

Muhyzater commented 3 years ago

You are less likely to face this issue when training from scratch, but you can also remove any utterance longer than 6 seconds from your filelists.
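The 6-second filter above can be applied as a pre-processing pass over the filelist. A minimal sketch using only the standard-library `wave` module is shown below; the helper names and the `path|text|speaker_id` entry format are assumptions, and this only works for PCM WAV files (Flowtron's usual input format).

```python
import wave

MAX_SECONDS = 6.0  # cutoff suggested above; tune for your GPU


def wav_duration(path):
    """Return the duration of a PCM WAV file in seconds."""
    with wave.open(path, "rb") as w:
        return w.getnframes() / float(w.getframerate())


def keep_entry(line, max_seconds=MAX_SECONDS):
    """True if a filelist entry's audio is at most `max_seconds` long."""
    path = line.split("|")[0]
    return wav_duration(path) <= max_seconds


if __name__ == "__main__":
    # write a 2-second silent mono 16-bit WAV at 22050 Hz for the demo
    with wave.open("demo.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(22050)
        w.writeframes(b"\x00\x00" * 22050 * 2)
    print(keep_entry("demo.wav|hello world|0"))  # 2 s <= 6 s, so kept
```

Duration-based filtering and the text-length check catch different outliers (slow speech vs. long transcripts), so combining both is reasonable.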