YannDubs / disentangling-vae

Experiments for understanding disentanglement in VAE latent representations

[BUG] Changed the stored values for batchTC to dim-wise kl #39

Closed · linesd closed this 5 years ago

linesd commented 5 years ago

Is the KL that you have called dw_kl_loss_ different from the usual kl_loss_ that we calculate for other models?

YannDubs commented 5 years ago

The name is misleading: it is the KL between the aggregate posterior and the prior, instead of the usual posterior. I.e., KL[q(z)||p(z)] instead of KL[q(z|x)||p(z)].
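For concreteness, here is a minimal PyTorch sketch of the two quantities being distinguished (function names here are illustrative, not the repo's API): the usual KL[q(z|x)||p(z)], which has a closed form for a diagonal Gaussian posterior, versus a naive minibatch Monte-Carlo estimate of the dimension-wise KL[q(z_j)||p(z_j)] between the aggregate posterior and the prior.

```python
import math
import torch

def analytic_kl_qzx_pz(mu, logvar):
    # Usual KL[q(z|x) || p(z)]: closed form for a diagonal Gaussian
    # posterior against a standard-normal prior, averaged over the batch.
    return (0.5 * (mu.pow(2) + logvar.exp() - 1 - logvar)).sum(dim=1).mean()

def log_density_gaussian(z, mu, logvar):
    # Element-wise log N(z; mu, diag(exp(logvar))).
    return -0.5 * (math.log(2 * math.pi) + logvar + (z - mu).pow(2) / logvar.exp())

def mc_dimwise_kl_qz_pz(z, mu, logvar):
    # Naive minibatch Monte-Carlo estimate of sum_j KL[q(z_j) || p(z_j)],
    # where q(z_j) is the aggregate (marginal) posterior of latent dim j.
    batch_size = z.size(0)
    # (B, B, D): log q(z^(i)_j | x^(k)) for every pair of batch samples i, k.
    mat = log_density_gaussian(z.unsqueeze(1), mu.unsqueeze(0), logvar.unsqueeze(0))
    # log q(z^(i)_j) ~= logsumexp_k(...) - log B  (aggregate over the batch).
    log_qz_j = torch.logsumexp(mat, dim=1) - math.log(batch_size)
    log_pz_j = log_density_gaussian(z, torch.zeros_like(z), torch.zeros_like(z))
    return (log_qz_j - log_pz_j).sum(dim=1).mean()

# Toy usage: the two values generally differ, which is the point of the issue.
mu, logvar = torch.randn(64, 10), torch.randn(64, 10) * 0.1
z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
print(analytic_kl_qzx_pz(mu, logvar), mc_dimwise_kl_qz_pz(z, mu, logvar))
```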

YannDubs commented 5 years ago

So it is different. But we do compute all the terms needed for the usual KL, so maybe I'll just save the usual KL instead, to avoid comparing apples and oranges. Just check that the viz side doesn't break with that naming, then merge. I'll take care of the rest and rerun :)
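A sketch of what "we already compute all the terms" could look like (variable names are hypothetical): if the TC decomposition already evaluates log q(z|x) and log p(z) at the sampled z, the usual KL can be logged as a Monte-Carlo estimate from those same per-sample terms.

```python
import torch

def usual_kl_from_terms(log_q_zCx: torch.Tensor, log_pz: torch.Tensor) -> torch.Tensor:
    """Monte-Carlo estimate of the usual KL[q(z|x) || p(z)].

    Both inputs are hypothetical names for per-sample log-densities,
    shape (batch_size,), evaluated at the same sampled z that the TC
    decomposition already uses.
    """
    return (log_q_zCx - log_pz).mean()
```

Storing this value would make the logged quantity directly comparable with the kl_loss_ recorded for the other models.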

YannDubs commented 5 years ago

Please don't merge the PR; I'll take care of the rest (I had to change the loss file and will add those changes locally, then push everything).