SeanNaren / warp-ctc

Pytorch Bindings for warp-ctc

Fix batch and length normalization, add tests, replace distutils with setuptools #21

jpuigcerver closed this pull request 6 years ago

jpuigcerver commented 6 years ago

In my previous PR there was a bug: the value of the normalized loss function was correct, but the gradients were those of the unnormalized version. This fixes that issue.
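To make the inconsistency concrete, here is a hedged illustration (not the repo's actual code) of why the same normalization factor has to be applied in both passes:

```python
import torch

def normalized_ctc_outputs(per_sample_losses, per_sample_grads, batch_size):
    # If the forward pass divides the summed loss by the batch size, the
    # backward pass has to scale the gradients by the same factor; otherwise
    # the reported loss and the gradients disagree by a constant factor.
    loss = per_sample_losses.sum() / batch_size   # the loss value was already correct
    grads = per_sample_grads / batch_size         # this scaling was the missing part
    return loss, grads

loss, grads = normalized_ctc_outputs(torch.tensor([2.0, 4.0]),
                                     torch.ones(50, 2, 20), batch_size=2)
```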

The user can choose to normalize the loss (and gradients) w.r.t. the batch size (size_average=True) or w.r.t. the total number of frames in the batch (length_average=True). The latter overrides the former.
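For reference, a minimal usage sketch of the two flags (the flag names come from this PR; the forward-call signature assumed here follows the existing warpctc_pytorch binding):

```python
import torch
from warpctc_pytorch import CTCLoss

# Normalize loss and gradients by the total number of frames in the batch;
# length_average=True takes precedence over size_average.
ctc_loss = CTCLoss(size_average=False, length_average=True)

seq_len, batch, num_labels = 50, 4, 20
acts = torch.randn(seq_len, batch, num_labels, requires_grad=True)      # unnormalized activations
labels = torch.randint(1, num_labels, (batch * 10,), dtype=torch.int)   # concatenated targets, blank = 0
label_sizes = torch.full((batch,), 10, dtype=torch.int)
act_sizes = torch.full((batch,), seq_len, dtype=torch.int)

loss = ctc_loss(acts, labels, act_sizes, label_sizes)
loss.backward()  # acts.grad is now scaled consistently with the returned loss
```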

I updated the tests to catch silly mistakes like this one.

In addition, distutils support will soon be deprecated in pip; currently, this warning is shown when one tries to uninstall the package with pip uninstall:

DEPRECATION: Uninstalling a distutils installed project (warpctc-pytorch) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.
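Conceptually, the packaging change amounts to building the package with setuptools rather than distutils; a hedged sketch of the relevant part of setup.py (the repo's real setup.py also builds the native extension, which is omitted here, and its metadata may differ):

```python
# setup.py (sketch): the import swap below is the essential change; the
# keyword arguments are illustrative placeholders, not the repo's exact metadata.
from setuptools import setup, find_packages  # previously: from distutils.core import setup

setup(
    name='warpctc_pytorch',
    version='0.1',
    packages=find_packages(),
)
```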
SeanNaren commented 6 years ago

Thanks man, I added a comment to the PR!

jpuigcerver commented 6 years ago

I can't find the comment...

SeanNaren commented 6 years ago

@jpuigcerver sorry, forgot to publish the review...

SeanNaren commented 6 years ago

Thanks man :)