Hi:
I want to use your warp-rnnt as the loss function to train my model, but I ran into a problem: I don't know how to do the backward pass. The output of rnnt_loss() is a cost and a grad, and both of them are tensors. Can you give an example showing how to do backward? I've put a rough sketch of my current understanding below. Thanks!
Neng
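
For reference, here is how I imagine it might be wired up, assuming the call that returns both the costs and the grads can be wrapped in a custom torch.autograd.Function. The name compute_rnnt_costs_and_grads below is just a placeholder for whichever warp-rnnt call returns the two tensors (it is not the real API name), so please correct me if this is not the intended way to use the library:

```python
import torch

# Placeholder for whichever low-level binding returns (costs, grads);
# the real warp-rnnt call and signature may differ.
def compute_rnnt_costs_and_grads(log_probs, labels, frames_lengths, labels_lengths):
    raise NotImplementedError("replace with the actual warp-rnnt call")

class RNNTLossFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, log_probs, labels, frames_lengths, labels_lengths):
        # costs: (N,) per-sequence loss; grads: same shape as log_probs,
        # the precomputed gradient of each cost w.r.t. log_probs.
        costs, grads = compute_rnnt_costs_and_grads(
            log_probs, labels, frames_lengths, labels_lengths)
        ctx.save_for_backward(grads)
        return costs

    @staticmethod
    def backward(ctx, grad_output):
        (grads,) = ctx.saved_tensors
        # Scale the saved gradient by the incoming gradient of each cost.
        grad_log_probs = grads * grad_output.view(-1, 1, 1, 1).to(grads)
        # Only log_probs needs a gradient; labels and lengths do not.
        return grad_log_probs, None, None, None

# Usage: reduce the per-sequence costs to a scalar, then call backward().
# loss = RNNTLossFn.apply(log_probs, labels, frames_lengths, labels_lengths).mean()
# loss.backward()
```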