baidu-research / warp-ctc

Fast parallel CTC.
Apache License 2.0

API improvement of use_softmax and zero_infinity #180

Open NKNaN opened 6 months ago

NKNaN commented 6 months ago

Following a suggestion from the PaddlePaddle community, it is recommended to add use_softmax and zero_infinity options to warp-ctc, bringing it in line with the PyTorch API torch.nn.functional.ctc_loss(log_probs, targets, input_lengths, target_lengths, blank=0, reduction='mean', zero_infinity=False).
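To make the intended semantics of the two options concrete, here is a minimal sketch of how a wrapper around a CTC kernel could honor them. This is an illustration, not the PR's actual code: `ctc_loss_with_options` and `ctc_kernel` are hypothetical names, and the kernel is assumed to take per-frame log-probabilities and return one loss per batch element.

```python
import numpy as np

def log_softmax(logits, axis=-1):
    # Numerically stable log-softmax over the class axis.
    shifted = logits - logits.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))

def ctc_loss_with_options(ctc_kernel, activations,
                          use_softmax=True, zero_infinity=False):
    # use_softmax=True: normalize raw activations to log-probabilities
    # before the kernel sees them (warp-ctc's historical behavior of
    # softmaxing internally). use_softmax=False: the caller has already
    # normalized, PyTorch-style, so pass activations through unchanged.
    log_probs = log_softmax(activations) if use_softmax else activations
    losses = ctc_kernel(log_probs)
    if zero_infinity:
        # Zero out infinite losses (e.g. a target longer than its input)
        # so they contribute neither to the loss nor to the gradients.
        losses = np.where(np.isinf(losses), 0.0, losses)
    return losses
```

With zero_infinity=True, a batch element whose loss diverges is silently dropped from the objective instead of poisoning the mean with inf/NaN.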

I made an attempt to add these two options in this PR; the logic is as follows:

I also added corresponding tests for these two cases. (P.S. The inf_test only seems to pass when the truncation steps are omitted.)
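For reference, the infinite-loss case that zero_infinity is meant to handle arises whenever no valid CTC alignment exists, e.g. when the (blank-extended) target is longer than the input. A minimal sketch of the standard CTC forward (alpha) recursion shows this; the function name is mine, not warp-ctc's, and it operates on a single sequence for clarity.

```python
import numpy as np

def ctc_neg_log_likelihood(log_probs, target, blank=0):
    """CTC forward (alpha) pass for one sequence.
    log_probs: (T, C) per-frame log-probabilities; target: list of label ids."""
    T, _ = log_probs.shape
    # Extend the target with blanks: b, l1, b, l2, b, ...
    ext = [blank]
    for label in target:
        ext += [label, blank]
    S = len(ext)
    alpha = np.full((T, S), -np.inf)
    alpha[0, 0] = log_probs[0, ext[0]]
    if S > 1:
        alpha[0, 1] = log_probs[0, ext[1]]
    for t in range(1, T):
        for s in range(S):
            cands = [alpha[t - 1, s]]
            if s > 0:
                cands.append(alpha[t - 1, s - 1])
            # Skip over a blank, unless that would merge repeated labels.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                cands.append(alpha[t - 1, s - 2])
            alpha[t, s] = np.logaddexp.reduce(cands) + log_probs[t, ext[s]]
    # Valid paths end in the last blank or the last label.
    tail = alpha[T - 1, S - 1]
    if S > 1:
        tail = np.logaddexp(tail, alpha[T - 1, S - 2])
    return -tail
```

With two input frames and a three-label target, no alignment can reach the final states, the total log-likelihood stays at -inf, and the loss is +inf; this is exactly the value zero_infinity replaces with 0.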