Closed: pavelxx1 closed this issue 3 years ago.
Hi, first of all, thanks for your repo! I have a question: the loss takes negative values after about 50 steps. Is that normal? I'm using the LJSpeech-1.1 dataset from the repo example.
Part of the training log: https://prnt.sc/10w9knw

Yes, it is normal. The loss used for training WaveFlow is the negative log probability density. A probability density can take values larger than one, so its negative log can be negative. In our previous experiments, the loss went down to between -4.7 and -5.2.
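To see concretely how the loss can go negative, consider a Gaussian with a small standard deviation: its density at the mode exceeds one, so the negative log density is negative. A minimal sketch of this arithmetic (my own illustration, not code from the repo):

```python
import numpy as np

# A narrow Gaussian has a peak density well above 1.
sigma = 0.01                                   # small standard deviation
x = 0.0                                        # evaluate the density at the mode

# log N(x; 0, sigma^2) = -0.5*log(2*pi*sigma^2) - x^2 / (2*sigma^2)
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)

loss = -log_p                                  # negative log probability density
print(loss)                                    # ~ -3.69: a perfectly valid negative loss
```

The same thing happens during training: as the model's density concentrates around the data, the average log density rises above zero and the negative log-likelihood loss drops below zero.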