1ytic / warp-rnnt

CUDA-Warp RNN-Transducer
MIT License

how to apply backward to the output of rnnt_loss #10

Closed huangnengCSU closed 4 years ago

huangnengCSU commented 4 years ago

Hi, I want to use your warp-rnnt as the loss function to train my model, but I ran into a problem: I don't know how to do the backward pass. The output of rnnt_loss() is a cost and a grad, both of which are tensors. Can you give an example showing how to do backward? Thanks! Neng
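For context, the usual pattern when a loss binding returns both the costs and the precomputed gradients is to wrap it in a `torch.autograd.Function`: return the costs from `forward`, stash the grads, and hand them back (scaled by the incoming gradient) in `backward`. Below is a minimal, hedged sketch of that pattern; `compute_rnnt_loss` is a hypothetical stand-in for the real binding, not warp-rnnt's actual API, and the fake costs/grads exist only so the example runs.

```python
import torch

def compute_rnnt_loss(log_probs, labels, frame_lens, label_lens):
    # Hypothetical stand-in for a binding that returns (costs, grads).
    # A real RNN-T binding would compute these from the lattice; here we
    # fabricate values of the right shapes so the wrapper is testable.
    costs = log_probs.sum(dim=(1, 2, 3))   # fake per-sequence cost, shape (N,)
    grads = torch.ones_like(log_probs)     # fake dcost/dlog_probs
    return costs, grads

class RNNTLoss(torch.autograd.Function):
    @staticmethod
    def forward(ctx, log_probs, labels, frame_lens, label_lens):
        costs, grads = compute_rnnt_loss(log_probs, labels, frame_lens, label_lens)
        ctx.save_for_backward(grads)       # keep the precomputed grads
        return costs

    @staticmethod
    def backward(ctx, grad_output):
        (grads,) = ctx.saved_tensors
        # Scale each sequence's stored gradient by its incoming gradient.
        # Only log_probs needs a gradient; the other inputs get None.
        return grads * grad_output.view(-1, 1, 1, 1), None, None, None

# Usage: apply the Function, reduce the per-sequence costs, call backward().
log_probs = torch.randn(2, 4, 3, 5, requires_grad=True)
costs = RNNTLoss.apply(log_probs, None, None, None)
costs.mean().backward()
print(log_probs.grad.shape)
```

Note that if the library's own `rnnt_loss` is already an autograd-aware wrapper, you can skip all of this and simply call `.backward()` on the (reduced) cost tensor directly.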