roggirg / AutoBots


Question on KL div loss #8

Closed sqchai closed 2 years ago

sqchai commented 2 years ago

Hi there, Thank you so much for sharing this fabulous work!

I'm a bit confused by the KL divergence loss calculation here. As far as I understand, `_postpr` holds the posterior probability values, not the log of the probabilities. Since `KLDivLoss` takes log probabilities, shouldn't we actually pass in `_logposterior`?

sqchai commented 2 years ago

nvm, found that `KLDivLoss` accepts a non-log target when `log_target` is left at its default of `False` :)
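For anyone landing here later, a minimal sketch (not from the AutoBots codebase; tensor names are made up) showing the behavior being discussed: with `log_target=False` (the default), `torch.nn.KLDivLoss` expects the *input* in log-space but the *target* as plain probabilities, so passing posterior probabilities directly is correct.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical distributions standing in for the model's prior/posterior.
log_q = F.log_softmax(torch.randn(4, 5), dim=-1)  # input: log-probabilities
p = F.softmax(torch.randn(4, 5), dim=-1)          # target: plain probabilities

# Default log_target=False: target is given as probabilities.
kl_default = torch.nn.KLDivLoss(reduction="batchmean", log_target=False)
loss = kl_default(log_q, p)

# Equivalent call when the target is supplied in log-space instead.
kl_log = torch.nn.KLDivLoss(reduction="batchmean", log_target=True)
loss_log = kl_log(log_q, p.log())

# Both match the manual KL(p || q) averaged over the batch.
manual = (p * (p.log() - log_q)).sum() / p.shape[0]
```

So the two conventions give the same value; the only question is whether the target tensor you hand in is in probability space or log space.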