Hi there,
Thank you so much for sharing this fabulous work!
I'm a bit confused by the KL divergence loss calculation here.
As far as I understand, _postpr holds the posterior probability values rather than the log of the probabilities. Since KLDivLoss expects log-probabilities as its input, should we actually pass in _logposterior instead?
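For reference, here is a minimal sketch (using made-up tensor names, not the actual variables from this repo) of how PyTorch's nn.KLDivLoss expects its arguments by default: the first argument should be log-probabilities, while the target is plain probabilities.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical example tensors; not the variables from this repo.
logits_q = torch.randn(8, 10)                      # model output (unnormalized scores)
p_target = F.softmax(torch.randn(8, 10), dim=-1)   # target distribution (probabilities)

kl = nn.KLDivLoss(reduction="batchmean")

# Expected usage: input is log-probabilities, target is probabilities
# (with the default log_target=False).
loss = kl(F.log_softmax(logits_q, dim=-1), p_target)
print(loss)

# Passing raw probabilities as the first argument (analogous to sending _postpr)
# would give an incorrect value, which is why I would expect _logposterior here.
```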