pytorch / ignite

High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently.
https://pytorch-ignite.ai
BSD 3-Clause "New" or "Revised" License

Add divergence metrics #3232

Closed kzkadc closed 3 months ago

kzkadc commented 3 months ago

Description: added KL and JS divergences between two categorical predictions

Check list:

vfdev-5 commented 3 months ago

@kzkadc can you please check this failure with pytorch 1.5.1: https://github.com/pytorch/ignite/actions/runs/8730344048/job/23953994151

TypeError: kl_div() got an unexpected keyword argument 'log_target'

Looks like the log_target arg was not added yet in 1.5.1. Can you see what the effort to support 1.5.1 would be? We have two options: a) fix the ignite code if this is a minimal fix, b) drop 1.5.1 from our PyTorch version tests and replace it with a minimal later version. Option a) would be preferable, but the fix should not be a large update of the code.

kzkadc commented 3 months ago

Thanks. kl_div's behaviour in 1.5.1 is equivalent to log_target=False (the default) in the current version. We can fix it by passing the target variable to kl_div as probabilities instead of log-probabilities.
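A minimal sketch of why the two calls are equivalent. It mirrors the pointwise formulas documented for torch.nn.functional.kl_div in pure Python (so it runs without torch); with log_target=False the target is a probability, with log_target=True it is a log-probability, and the two agree when target = exp(log_target):

```python
import math

def kl_div_pointwise(log_input, target, log_target=False):
    # Mirrors torch.nn.functional.kl_div's pointwise terms (reduction="sum"
    # is just the sum of these). This is a sketch, not ignite's code.
    if log_target:
        # target is a log-probability: exp(t) * (t - input)
        return [math.exp(t) * (t - li) for li, t in zip(log_input, target)]
    # target is a probability: t * (log(t) - input)
    return [t * (math.log(t) - li) for li, t in zip(log_input, target)]

p = [0.2, 0.5, 0.3]
q = [0.1, 0.6, 0.3]
log_p = [math.log(v) for v in p]
log_q = [math.log(v) for v in q]

# 1.5.1-compatible call: pass the target as probabilities (log_target=False)
a = sum(kl_div_pointwise(log_q, p))
# Newer call: pass the target as log-probabilities with log_target=True
b = sum(kl_div_pointwise(log_q, log_p, log_target=True))
assert abs(a - b) < 1e-12  # both compute KL(p || q)
```

So on 1.5.1 the metric can call kl_div with target probabilities and get the same value.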

vfdev-5 commented 3 months ago

we can also make a small if-else depending on pytorch version. For example: https://github.com/pytorch/ignite/blob/f431e60b09743dc8d99b7e5f32e234f46a2a920d/ignite/metrics/gan/fid.py#L15
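A sketch of the version-gated dispatch along the lines of the fid.py pattern linked above. The helper below parses a torch version string directly so the example runs without torch installed; the kl_div calls in the comments and the helper name are illustrative, not ignite's actual code:

```python
def torch_version_at_least(version_str: str, major: int, minor: int) -> bool:
    """Return True if a torch version string is >= (major, minor).

    Strips local build suffixes like '+cu118' before parsing.
    """
    base = version_str.split("+")[0]
    parts = base.split(".")
    return (int(parts[0]), int(parts[1])) >= (major, minor)

# Hypothetical dispatch: log_target was not available in torch 1.5.1,
# so pass log-probabilities only on new enough versions and fall back
# to probabilities otherwise, e.g.:
#
# if torch_version_at_least(torch.__version__, 1, 6):
#     loss = F.kl_div(log_input, log_target, log_target=True)
# else:
#     loss = F.kl_div(log_input, log_target.exp())
```

In ignite itself the check would use `Version(torch.__version__)` as fid.py does, rather than hand-rolled string parsing.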

Using log_target may give better numerical stability (to confirm), so for recent PyTorch it would be better to use this option if it makes more sense than keeping the default log_target=False.
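For reference, a minimal pure-Python sketch of the two metrics this PR adds, written from the textbook definitions rather than from ignite's implementation. JS is the symmetrized divergence built from two KL terms against the mixture distribution:

```python
import math

def kl(p, q):
    # KL(p || q) for categorical distributions given as probability lists
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence: 0.5*KL(p||m) + 0.5*KL(q||m), m the mixture.
    # Symmetric in p and q, and bounded above by log(2) (natural log base).
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.2, 0.5, 0.3]
q = [0.1, 0.6, 0.3]
assert abs(js(p, q) - js(q, p)) < 1e-12  # symmetric, unlike KL
assert 0 <= js(p, q) <= math.log(2)      # bounded
```

The ignite metrics would compute these batch-wise on softmax outputs via kl_div; this sketch only pins down the quantities being discussed.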