alecGraves / BVAE-tf

Disentangled Variational Auto-Encoder in TensorFlow / Keras (Beta-VAE)
The Unlicense
54 stars 13 forks

wrong implementation? #7

Closed iperov closed 1 year ago

iperov commented 5 years ago

I checked various PyTorch repos; all of them have a loss term for the mean and log_var values, but yours does not. The resampling formula in your repo is also wrong.

alecGraves commented 5 years ago

I guess I do not understand. This implementation does have a loss added for the KL divergence using the mean and log_var / std dev; it is just added inside the layer using Layer.add_loss().
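For reference, the KL term being discussed here is the standard closed form for KL(N(mean, exp(logvar)) || N(0, 1)), computed from the encoder's mean and log-variance outputs. A minimal NumPy sketch of that computation (variable names are illustrative, not taken from the repo):

```python
import numpy as np

def kl_divergence(mean, logvar):
    """Closed-form KL(N(mean, exp(logvar)) || N(0, 1)),
    summed over the latent dimensions."""
    return -0.5 * np.sum(1.0 + logvar - mean**2 - np.exp(logvar))

# At mean = 0 and logvar = 0 the latent is exactly the unit Gaussian,
# so the KL term vanishes; any other setting gives a positive penalty.
print(kl_divergence(np.zeros(4), np.zeros(4)))  # 0.0
print(kl_divergence(np.ones(4), np.zeros(4)))   # 2.0
```

In a Keras layer this quantity (typically scaled by beta in a Beta-VAE) is what gets registered with Layer.add_loss() so it is minimized alongside the reconstruction loss.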

Also, what is meant by "resampler formula"? Reparameterization? I did find this difference:

return mean + K.exp(stddev) * epsilon

while other implementations normally have this:

return mean + K.exp(stddev/2) * epsilon

Anyway, thank you for contributing 😃 !!

Gregor0410 commented 5 years ago

In the bottom example, the stddev variable actually represents the log variance. Dividing it by two and putting it through the exp function simply converts log variance to standard deviation. If you want to use the standard deviation directly, instead of the variance or log variance, it would just be return mean + stddev * epsilon
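This equivalence can be checked numerically: exp(logvar / 2) = exp(log(sigma^2) / 2) = sigma, so the two parameterizations produce identical samples for the same noise draw. A small NumPy sketch (function names are illustrative):

```python
import numpy as np

def sample_from_logvar(mean, logvar, epsilon):
    # Reparameterization when the encoder outputs log variance:
    # exp(logvar / 2) recovers the standard deviation.
    return mean + np.exp(logvar / 2.0) * epsilon

def sample_from_stddev(mean, stddev, epsilon):
    # Equivalent form when the encoder outputs the std dev directly.
    return mean + stddev * epsilon

sigma = 0.5
logvar = np.log(sigma**2)
eps = np.array([1.0, -2.0, 0.3])

# Both forms yield the same sample for the same epsilon.
print(sample_from_logvar(0.0, logvar, eps))
print(sample_from_stddev(0.0, sigma, eps))
```

The original line, return mean + K.exp(stddev) * epsilon, would exponentiate the full log variance, effectively sampling with variance sigma^4 rather than sigma^2.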

alecGraves commented 5 years ago

Thanks @Gregor0410, I corrected the output of the previous layer to be called logvar.