Closed by padmalcom 4 years ago
I set a large batch size in the script. You can try reducing that. I'll look into memory issues. Which repo version are you using? Does it include binned sampling already? That already reduces the memory consumption.
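If it helps, this is roughly how the batch size could be lowered programmatically. It is only a sketch, not code from the repo: the `data_config.yaml` file name and the `batch_size` key are assumptions, so check the actual layout under `config/melgan` first.

```python
# Hypothetical helper: halve the batch size in a YAML config to cut peak memory.
# The file name "data_config.yaml" and the "batch_size" key are assumptions --
# verify the real config layout under config/melgan before using this.
from pathlib import Path
import yaml

config_path = Path("config/melgan/data_config.yaml")  # assumed location
config = yaml.safe_load(config_path.read_text())

if "batch_size" in config:
    config["batch_size"] = max(1, config["batch_size"] // 2)  # halve, but keep at least 1
    config_path.write_text(yaml.safe_dump(config))
```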
Hi, I just realized that I had not updated my code for a while. I'll close this one and check whether it works with the newest version.
Hi, I found time again to train a model on a larger German dataset. Creating the dataset works well, but when I execute extract_durations.py the process consumes more and more RAM until it ends with a simple "Killed". I have 32 GB of RAM.
This is how I call it:
```
(TransformerTTS) [user1@localhost TransformerTTS]$ python extract_durations.py --config config/melgan --binary --fix_jumps --fill_mode_next --store_predictions
```
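As a side note, a small generic watcher like the sketch below can log the resident memory of the running process until it gets killed. This is not part of the repo; it assumes `psutil` is installed and that the PID of the extract_durations.py process is looked up by hand (e.g. with `pgrep`).

```python
# Generic memory watcher: print the RSS of a running process every few seconds
# so the growth can be tracked until the OOM kill. psutil is a third-party package.
import sys
import time
import psutil

pid = int(sys.argv[1])  # PID of the running extract_durations.py process
proc = psutil.Process(pid)

while proc.is_running():
    rss_gb = proc.memory_info().rss / 1024**3
    print(f"RSS: {rss_gb:.2f} GB")
    time.sleep(5)
```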
These are the last three lines I see: