zincware / ZnNL

A Python package for studying neural learning
Eclipse Public License 2.0

Remove `apply_softmax` in cross entropy loss class #70

Closed by jhossbach 1 year ago

jhossbach commented 1 year ago

We use `optax.softmax_cross_entropy` for the cross-entropy calculation, which expects unnormalized log probabilities. If we initialize our cross-entropy loss with `apply_softmax=True`, the softmax ends up being applied twice.
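The double-softmax problem described above can be sketched numerically. The snippet below is a minimal numpy illustration (not ZnNL's actual loss class): `cross_entropy` mimics the behavior of `optax.softmax_cross_entropy`, which expects raw logits, and passing already-softmaxed inputs into it yields a different (inflated) loss because softmax of a probability vector flattens the distribution.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def log_softmax(x):
    # Numerically stable log-softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def cross_entropy(logits, labels_onehot):
    # Mirrors optax.softmax_cross_entropy: expects *unnormalized* logits
    # and applies the softmax internally via log_softmax.
    return -(labels_onehot * log_softmax(logits)).sum(axis=-1)

logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([[1.0, 0.0, 0.0]])

# Correct usage: pass raw logits straight through.
correct = cross_entropy(logits, labels)

# The bug from the issue: apply_softmax=True normalizes first, so the
# loss effectively sees softmax(softmax(logits)) and is biased upward.
double = cross_entropy(softmax(logits), labels)

print(correct, double)  # the double-softmax loss is larger
```

This is why `apply_softmax` should be removed rather than defaulted to `False`: with optax handling normalization internally, there is no correct use for a pre-applied softmax.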

SamTov commented 1 year ago

Yes, it is legacy code from before optax was used.

SamTov commented 1 year ago

@jhossbach Is this resolved?

jhossbach commented 1 year ago

No, but I can create a PR for that.