Open ThibaultGROUEIX opened 4 years ago
My setting is the same as the original paper. As you can see in the Decoder of model.py, the output is the parameters of the Bernoulli distribution.
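For readers unfamiliar with this setup: when the decoder outputs Bernoulli parameters, the reconstruction term of the ELBO is the Bernoulli log-likelihood (equivalently, binary cross-entropy). A minimal sketch of that term, with a hypothetical helper name not taken from the repo:

```python
import numpy as np

def bernoulli_recon_loss(x, p, eps=1e-7):
    """Negative Bernoulli log-likelihood of binary data x under decoder
    output p (per-pixel Bernoulli parameters), summed over dimensions.
    Hypothetical illustration, not the repo's actual code."""
    p = np.clip(p, eps, 1.0 - eps)  # avoid log(0)
    return -np.sum(x * np.log(p) + (1.0 - x) * np.log(1.0 - p), axis=-1)

x = np.array([[1.0, 0.0]])   # a binarized "image" with two pixels
p = np.array([[0.9, 0.1]])   # decoder output
loss = bernoulli_recon_loss(x, p)
```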
Thanks! Sorry, I didn't dare dive into the details of the ELBO loss; I'm not familiar with it and would need comments to follow through. I just saw a lot of Gaussians in there, so I assumed you were using a multivariate Gaussian distribution. Thanks again.
Related question: I am trying the code on AFFNIST and it systematically crashes. It seems that z_sigma_2log becomes too large here, which gives NaN after the exponential. Do you have an idea why that might happen? Thanks again.
Hi, I ran your code on my computer, but I didn't get results as good as your figure (accuracy about 81%). I'm not sure whether I need to reset the parameters?
Hi, when I set hid_dim to 50, the accuracy I got was worse. I don't know what happened here? Any ideas?
@huyong1369 I'm running related experiments using this code. The average accuracy is about 81% for MNIST. Try repeating the experiment to achieve better results.
Hello, this happened to me today. Were you able to solve this issue? Thank you.
No, I did not solve it. As you can see from Table 2 here, we reported the divergence problem on AFFNIST, but the code ran on our other toy datasets.
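One common mitigation for this kind of overflow (a sketch only, not something from the repo) is to clamp the log-variance before exponentiating, so exp() cannot blow up to inf and propagate NaN. The variable name z_sigma_2log follows the thread; the clamp bound is an assumption:

```python
import numpy as np

def safe_exp_logvar(z_sigma_2log, max_logvar=10.0):
    """Clamp the predicted log-variance to a finite range before
    exponentiating, so the variance stays finite even when the encoder
    output diverges. The bound 10.0 is an arbitrary assumption."""
    return np.exp(np.clip(z_sigma_2log, -max_logvar, max_logvar))

vals = safe_exp_logvar(np.array([1e6, -1e6, 0.0]))  # finite even for huge inputs
```

Note this only masks the symptom; the underlying divergence on AFFNIST that the authors report would still need its own fix (e.g. lower learning rate or a KL warm-up).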
Hi @GuHongyang, thanks for the reimplementation! They mention in the paper that
As for the generative process in Section 3.1, the multivariate Bernoulli distribution is used for MNIST dataset, and the multivariate Gaussian distribution is used for the others
. In your code, you've implemented the multivariate Gaussian distribution, right? Sincerely, Thibault
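For contrast with the Bernoulli case on MNIST, the Gaussian generative process the paper describes for the other datasets would use a diagonal Gaussian log-likelihood as the reconstruction term. A hedged sketch with a hypothetical helper name, not the reimplementation's actual code:

```python
import numpy as np

def gaussian_recon_loss(x, mu, logvar):
    """Negative log-likelihood of x under a diagonal multivariate
    Gaussian decoder with mean mu and log-variance logvar, summed
    over dimensions. Illustrative only, not from the repo."""
    var = np.exp(logvar)
    return 0.5 * np.sum(logvar + (x - mu) ** 2 / var + np.log(2 * np.pi),
                        axis=-1)

# With x == mu and unit variance, the loss reduces to 0.5*log(2*pi) per dim.
loss = gaussian_recon_loss(np.zeros((1, 1)), np.zeros((1, 1)), np.zeros((1, 1)))
```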