Yumlembam opened this issue 3 years ago
Can you please explain what the KL loss, `kl_loss = 1 / x.size(0) * self.kl_loss()`, is in the loss function? It would be very helpful.

It is the Kullback–Leibler divergence between the latent distribution of z and a standard Normal distribution; the `1 / x.size(0)` factor averages it over the batch. For more details, see: https://towardsdatascience.com/understanding-variational-autoencoders-vaes-f70510919f73
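For reference, here is a minimal sketch of the usual closed-form KL term for a VAE with a diagonal Gaussian posterior, written in PyTorch. The names `mu`, `log_var`, and `kl_divergence` are assumptions for illustration; the exact reduction inside this repo's `self.kl_loss()` may differ, but the batch-size division works the same way:

```python
import torch

def kl_divergence(mu: torch.Tensor, log_var: torch.Tensor) -> torch.Tensor:
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ) for a diagonal Gaussian,
    # summed over both the latent dimensions and the batch.
    return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())

# Hypothetical shapes: a batch of 4 inputs with an 8-dimensional latent space.
x = torch.randn(4, 784)        # stand-in for an input batch
mu = torch.randn(4, 8)         # encoder mean (hypothetical)
log_var = torch.randn(4, 8)    # encoder log-variance (hypothetical)

# Dividing the summed KL by the batch size gives the per-sample average,
# which is what `1 / x.size(0) * self.kl_loss()` computes.
kl_loss = 1 / x.size(0) * kl_divergence(mu, log_var)
print(kl_loss)
```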