YongfeiYan / Gumbel_Softmax_VAE

PyTorch implementation of a Variational Autoencoder with Gumbel-Softmax Distribution

KL divergence term #6

Open izaskr opened 5 years ago

izaskr commented 5 years ago

I was wondering what exactly this line in the KLD calculation does: `log_ratio = torch.log(qy * categorical_dim + 1e-20)`

In the definition of the ELBO loss, the KLD should be computed between the variational distribution q(z|x) and the prior p(z). Why didn't you simply use the PyTorch implementation of the KL divergence (`F.kl_div`)?

patrick-g-zhang commented 4 years ago

Hi, I think it is the KL divergence between the posterior and the uniform prior: for a categorical posterior q over K categories and a uniform prior p = 1/K, KL(q || p) = Σᵢ qᵢ log(qᵢ / (1/K)) = Σᵢ qᵢ log(qᵢ · K), which is exactly what `qy * log_ratio` sums up. The `1e-20` just guards against `log(0)`.
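
A minimal sketch of this reading, with a made-up posterior `qy` (the tensor values and `categorical_dim` here are illustrative, not from the repo): the repo's manual expression and PyTorch's `F.kl_div` against a uniform prior should agree.

```python
import torch
import torch.nn.functional as F

# Hypothetical posterior over a categorical latent: batch of 1, K = 4 categories.
categorical_dim = 4
qy = torch.tensor([[0.1, 0.2, 0.3, 0.4]])

# Manual KL(q || Uniform(K)) as in the repo's loss:
# KL = sum_i q_i * log(q_i / (1/K)) = sum_i q_i * log(q_i * K)
log_ratio = torch.log(qy * categorical_dim + 1e-20)  # 1e-20 guards against log(0)
kld_manual = torch.sum(qy * log_ratio, dim=-1).mean()

# Same quantity via F.kl_div: it takes the *log*-probabilities of the reference
# distribution as first argument, i.e. kl_div(log_p, q) = sum q * (log q - log p).
log_uniform = torch.full_like(qy, -torch.log(torch.tensor(float(categorical_dim))))
kld_builtin = F.kl_div(log_uniform, qy, reduction='batchmean')

print(kld_manual.item(), kld_builtin.item())  # the two values match
```

So `F.kl_div` could have been used; the manual form just folds the constant uniform prior directly into the expression.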