stes opened 1 month ago
TODO: Fix case where batch_size is None
@stes, this is about what I implemented in #202, which I see here.
I think it would be good to have a really basic function where you provide the loss and the batch size, so that it is easily usable in the PyTorch implementation as well.
Also, it would be nice to test the default `CEBRA.batch_size = None`; I am not sure it is handled here.
This adds a better goodness-of-fit measure. Instead of the old variant, which simply matched the InfoNCE loss and therefore depends on the batch size, the proposed measure is zero for an untrained (chance-level) model and increases during training.
The conversion is quite simply done via `GoF = log N - InfoNCE`, where `N` is the batch size and `log N` is the chance-level InfoNCE loss of an untrained model.
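A minimal sketch of that conversion, assuming the goodness of fit is the chance-level loss `log N` minus the measured InfoNCE, reported in bits (the function name is illustrative, not CEBRA's public API):

```python
import math

def infonce_to_gof(infonce: float, batch_size: int) -> float:
    # Chance level: the InfoNCE loss of an untrained model is log(batch_size).
    chance_level = math.log(batch_size)
    # Difference in nats, converted to bits.
    return (chance_level - infonce) * math.log2(math.e)
```

For example, `infonce_to_gof(math.log(512), 512)` gives 0.0 bits (a model at chance level), and any loss below `log(512)` gives a positive value.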
This measure is also used in DeWolf et al., 2024, Eq. (43).
Application example (GoF improves from 0 to a larger value during training):
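For illustration only (the losses below are synthetic, not from a real run, and the conversion helper is the assumed `log N - InfoNCE` form in bits), a decreasing InfoNCE curve maps to a GoF rising from roughly 0:

```python
import math

def infonce_to_gof(infonce, batch_size):
    # Assumed conversion: chance level log(N) minus the loss, in bits.
    return (math.log(batch_size) - infonce) * math.log2(math.e)

batch_size = 512
# Synthetic training curve, starting at chance level log(512) ≈ 6.24.
losses = [math.log(batch_size), 6.0, 5.5, 5.0]
gofs = [infonce_to_gof(loss, batch_size) for loss in losses]
# GoF starts at 0 (chance level) and increases as the loss drops.
```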
Closes https://github.com/AdaptiveMotorControlLab/CEBRA-dev/pull/669