k2-fsa / fast_rnnt

A torch implementation of a recursion which turns out to be useful for RNN-T.
Other

added return_grad for all types of rnnt loss #29

Closed · durson closed this 10 months ago

durson commented 10 months ago

Gradient tensors can be useful for the full (ordinary and pruned) RNN-T losses.
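
A minimal sketch of the intended usage, assuming the new `return_grad` flag on `fast_rnnt.rnnt_loss` mirrors the one already exposed by `rnnt_loss_smoothed` and returns the occupation-probability gradients alongside the loss (all shapes and values below are illustrative):

```python
import torch
import fast_rnnt

B, T, S, C = 2, 10, 4, 30          # batch, frames, target symbols, vocab size
logits = torch.randn(B, T, S + 1, C)              # joint-network output
symbols = torch.randint(1, C, (B, S))             # target label sequences
boundary = torch.tensor([[0, 0, S, T]] * B, dtype=torch.int64)

# With return_grad=True (the flag this PR adds for the full losses),
# the call is assumed to also return the gradients of px and py.
loss, (px_grad, py_grad) = fast_rnnt.rnnt_loss(
    logits=logits,
    symbols=symbols,
    termination_symbol=0,
    boundary=boundary,
    return_grad=True,
)
```

In the pruned workflow these gradients are what `fast_rnnt.get_rnnt_prune_ranges` consumes to choose pruning bounds, which is one reason exposing them for all loss variants is useful.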

pkufool commented 10 months ago

Can you also fix this issue (https://github.com/k2-fsa/fast_rnnt/pull/25#issuecomment-1641313800)? Thanks!

durson commented 10 months ago

I fixed #25 (comment). However, I preserved the S > 0 requirement, since dropping it is not compatible with torchaudio and the unit test was not passing; a shape sketch follows below.
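
For context, a minimal sketch of the torchaudio layout that the S > 0 requirement keeps compatible (names and values here are illustrative; with S = 0 the `targets` tensor below would be empty):

```python
import torch
import torchaudio

B, T, S, C = 2, 10, 4, 30        # S > 0: every utterance keeps >= 1 symbol
logits = torch.randn(B, T, S + 1, C)                      # (B, T, S + 1, C)
targets = torch.randint(1, C, (B, S), dtype=torch.int32)  # (B, S), S >= 1
logit_lengths = torch.full((B,), T, dtype=torch.int32)
target_lengths = torch.full((B,), S, dtype=torch.int32)

# The unit test compares fast_rnnt against this reference loss,
# which is why the S > 0 requirement was preserved.
loss = torchaudio.functional.rnnt_loss(
    logits, targets, logit_lengths, target_lengths, blank=0
)
```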

Summary of the changes:

pkufool commented 10 months ago

@durson I have added GitHub Actions to this repo. Could you sync the PR with the latest master? Thanks!

durson commented 10 months ago

Merged the latest master into my fork and the tests are passing.

pkufool commented 10 months ago

Thanks! Merging! It would be very nice if you could also make a PR to https://github.com/k2-fsa/k2 (k2/python/k2/rnnt_loss.py).