1ytic / warp-rnnt

CUDA-Warp RNN-Transducer
MIT License

Question about average_frames and reduction params #27

Open wl-junlin opened 2 years ago

wl-junlin commented 2 years ago

I want a stable loss that is robust to labels_lengths during training. What values should I pass to these two params?

Also, what is the approximate relationship between the loss and the actual WER? For example, if I want a WER around 0.5, what should the loss value be?

1ytic commented 2 years ago

You shouldn't average over frames. If I remember correctly, it doesn't make sense theoretically: the loss is computed over the entire utterance.
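A plain-Python sketch of why frame averaging distorts the objective (the numbers and variable names are hypothetical, not from the library): dividing each utterance's loss by its frame count down-weights long utterances relative to short ones, which is presumably why the per-utterance loss is preferred.

```python
# Hypothetical per-utterance RNN-T losses and frame counts.
losses = [120.0, 40.0]
frames = [600, 100]

# average_frames=False, reduction="mean": average over the batch only,
# so each utterance contributes its full loss.
batch_mean = sum(losses) / len(losses)  # 80.0

# average_frames=True would first divide each loss by its frame count,
# so the long utterance (600 frames) suddenly counts for less than
# the short one, even though it carries more supervision.
per_frame = [l / t for l, t in zip(losses, frames)]  # [0.2, 0.4]
```

So for a loss that is comparable across batches, `average_frames=False` with `reduction="mean"` (averaging over utterances, not frames) is the natural choice.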

There is no direct link between the RNN-T loss value and the WER. I think a good analogue would be the negative log-likelihood and the accuracy of a classifier.
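A tiny, self-contained illustration of that analogy (the probabilities below are made up): two classifiers can make identical predictions, and so have identical accuracy, while their negative log-likelihoods differ because one is more confident. The same decoupling holds between RNN-T loss and WER.

```python
import math

def nll(correct_class_probs):
    # Mean negative log-likelihood over examples.
    return -sum(math.log(p) for p in correct_class_probs) / len(correct_class_probs)

# Both models predict the correct class every time (same "accuracy"),
# but the hesitant model assigns it lower probability, so its NLL is higher.
confident = [0.9, 0.9]
hesitant = [0.6, 0.6]

print(nll(confident))  # ~0.105
print(nll(hesitant))   # ~0.511
```

In other words, the loss tracks model confidence in the reference transcript, not the edit distance of the decoded hypothesis, so no fixed loss value maps to a given WER.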