pratapaprasanna closed this issue 5 years ago.
It is normal at the beginning of training, due to the construction of the CTC loss and the random initialization. If there are multiple warnings in the middle of training, it is a sign that your model is diverging.
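One common source of these warnings is the CTC length constraint: CTC can only align a label sequence if the encoder emits at least one frame per label, plus one blank frame between each pair of adjacent repeated labels. A minimal sketch of that constraint (plain Python, not OpenSeq2Seq code):

```python
def min_ctc_input_length(labels):
    """Minimum number of time frames CTC needs to emit `labels`:
    one frame per label, plus one blank frame between each pair of
    adjacent repeated labels (CTC collapses repeats otherwise)."""
    repeats = sum(1 for a, b in zip(labels, labels[1:]) if a == b)
    return len(labels) + repeats

# If the encoder output is shorter than this, there is no valid
# alignment, and CTC implementations emit "no valid path" warnings.
print(min_ctc_input_length([3, 3, 7]))  # 3 labels + 1 repeat -> 4
print(min_ctc_input_length([1, 2, 3]))  # no repeats -> 3
```

This is why very long transcripts paired with heavily downsampled audio features can trigger the warning regardless of how well the model is training.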
OK, thanks @blisc.
@blisc Is there a way to reduce these warnings? Should we tune hyperparameters, or try to reduce our vocab?
Hi all,
I trained the OpenSeq2Seq Speech2Text model (jasper10x5_LibriSpeech_nvgrad.py) for two days, and the predictions were good.
Now I want to load this model and start another training run, for which I have done only the following.
Then I triggered training on the same dataset, and now I am getting the following in my logs:
Can anyone help me understand why I am encountering this issue? Any help would be of great use.
Thanks in advance.
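For readers hitting the same situation: a sketch of the two usual ways to restart OpenSeq2Seq training from an existing model, assuming the standard `run.py` entry point (the config path below is a placeholder for your actual config):

```shell
# Option 1: resume training. Keep the same logdir and pass
# --continue_learning so run.py restores the latest checkpoint
# and continues from the saved global step.
python run.py \
    --config_file=example_configs/speech2text/jasper10x5_LibriSpeech_nvgrad.py \
    --mode=train_eval \
    --continue_learning

# Option 2: fine-tuning / warm start. Point the "load_model"
# parameter in base_params of the config at the old logdir;
# weights are restored into a fresh logdir with a reset global step.
```

Which one applies depends on whether you want the optimizer state and learning-rate schedule to pick up where they left off (Option 1) or to start a new schedule from the pretrained weights (Option 2).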