Closed alecGraves closed 5 years ago
I do not know whether I should take the abs of the stddev component of the latent space or not... I think it breaks the loss function if it is negative?
```python
# kl divergence
latent_loss = -0.5 * K.mean(1 + stddev - K.square(mean) - K.exp(stddev), axis=-1)
```
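For reference, in the standard VAE parameterization the encoder's second output is usually interpreted as the log-variance (`log(sigma^2)`), not the standard deviation itself. Under that reading, negative values are expected (they simply mean `sigma < 1`) and the loss above is well-defined without any `abs`. Here is a minimal NumPy sketch of that interpretation; the function name `kl_divergence` and the variable `log_var` are illustrative, not from the original code:

```python
import numpy as np

def kl_divergence(mean, log_var):
    """Per-sample KL(q || N(0, I)) where the encoder outputs
    log-variance (log sigma^2) rather than sigma itself.

    Because log_var is a logarithm it can legitimately be
    negative (sigma < 1), so no abs() is needed; the exp()
    in the formula maps it back to a positive variance."""
    return -0.5 * np.mean(1 + log_var - np.square(mean) - np.exp(log_var),
                          axis=-1)

# Sanity check: a posterior equal to the prior (mean 0, sigma 1,
# i.e. log_var = 0) has zero divergence, and any other posterior
# has positive divergence.
print(np.allclose(kl_divergence(np.zeros((1, 4)), np.zeros((1, 4))), 0.0))
print(kl_divergence(np.ones((1, 4)), np.zeros((1, 4))) > 0)
```

If the variable really were a raw standard deviation, `K.square(stddev)` would typically appear in place of `K.exp(stddev)`, so which fix is right depends on what the sampling layer does with that tensor.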