zincware / ZnNL

A Python package for studying neural learning
Eclipse Public License 2.0

Redundant calculation of loss during training step #72

Open jhossbach opened 1 year ago

jhossbach commented 1 year ago

In the code linked below, `value_and_grad` can compute both the loss and its gradient at the same time. We can use that and remove the unnecessary second calculation of the loss in `_compute_metrics`. A sketch of the idea is included after the link.

https://github.com/zincware/ZnRND/blob/36b921aae1580ee4ec64a36219db77e9f3ad27d9/znrnd/models/jax_model.py#L137-L146
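For illustration, here is a minimal, self-contained sketch (not the actual ZnRND implementation; `loss_fn`, `train_step`, and the linear model are hypothetical stand-ins) showing how `jax.value_and_grad` returns the loss together with the gradients, so the loss can be passed on to a metrics routine instead of being recomputed:

```python
import jax
import jax.numpy as jnp


def loss_fn(params, batch):
    """MSE of a toy linear model; stands in for the real apply_fn + loss call."""
    predictions = batch["inputs"] @ params["w"] + params["b"]
    return jnp.mean((predictions - batch["targets"]) ** 2)


def train_step(params, batch, learning_rate=1e-2):
    """Single SGD step that reuses the loss returned by value_and_grad.

    The loss is computed once here and returned alongside the updated
    parameters, so a metrics routine (e.g. something like _compute_metrics)
    can consume it directly instead of evaluating loss_fn a second time.
    """
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    params = jax.tree_util.tree_map(lambda p, g: p - learning_rate * g, params, grads)
    return params, loss


# Minimal usage with random data.
key = jax.random.PRNGKey(0)
params = {"w": jnp.zeros((3, 1)), "b": jnp.zeros((1,))}
batch = {
    "inputs": jax.random.normal(key, (8, 3)),
    "targets": jax.random.normal(key, (8, 1)),
}
params, loss = train_step(params, batch)
print(float(loss))
```

The same pattern should apply in `jax_model.py`: forward the loss returned by `value_and_grad` to the metrics computation rather than calling the loss function again on the same batch.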

SamTov commented 9 months ago

@KonstiNik Did you resolve this at some stage during your restructuring?