Closed. chao1224 closed this issue 7 years ago.
Hey Shengchao!
On Tue, Aug 1, 2017 at 8:29 PM, Shengchao Liu notifications@github.com wrote:
@mkusner https://github.com/mkusner Hi Kusner, I have some questions about the dimension of the latent space.
Checking the pretrained folder and the training scripts for the GVAE https://github.com/mkusner/grammarVAE/blob/master/train_zinc.py#L21 and the CVAE https://github.com/mkusner/grammarVAE/blob/master/train_zinc_str.py#L18, it looks like you use 56 as the latent dimension for both the GVAE and the CVAE. In the paper appendix, you also plot a 2-dimensional latent space.
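For context, here is a minimal sketch of how I read that setting: the latent dimension is a single training hyperparameter (56 in both linked scripts) that gets passed into the model when it is built. The names below (`--latent_dim`, `latent_rep_size`) are my own placeholders, not necessarily the repo's exact identifiers:

```python
# Minimal sketch (hypothetical names, not the repo's exact code): exposing the
# latent dimension as a training-script hyperparameter.
import argparse


def get_arguments():
    parser = argparse.ArgumentParser(description="Train a molecule VAE.")
    # 56 matches the value set in train_zinc.py / train_zinc_str.py;
    # 2 and 292 are the other dimensions discussed in this thread.
    parser.add_argument("--latent_dim", type=int, default=56,
                        help="Dimensionality of the latent space.")
    parser.add_argument("--epochs", type=int, default=100)
    return parser.parse_args()


if __name__ == "__main__":
    args = get_arguments()
    # The model would then be built with this value, e.g. something like
    # model.create(charset, max_length=MAX_LEN, latent_rep_size=args.latent_dim)
    print("Training with latent dimension:", args.latent_dim)
```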
So two questions here:
- It seems you show results for 2- and 56-dimensional latent spaces. Have you tried optimizing over other latent dimensions, or did you explore that only during Bayesian optimization? (See the sketch below for the kind of sweep I mean.)
- In Gomez's paper and code, both 56 and 292 were tried as latent dimensions. Have you also run those experiments, or did you skip 292 because 56 performed better?
Please correct me if I missed something.
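To make the first question concrete, this is the kind of comparison I have in mind: one training run per candidate latent dimension. It assumes a `--latent_dim` flag like the sketch above; the actual train_zinc.py sets the value at line 21 instead, so it would need to be edited or exposed as a flag first.

```python
# Minimal sketch (hypothetical flag, real script name): compare several latent
# dimensions by launching one training run per setting.
import subprocess

for latent_dim in (2, 56, 292):
    subprocess.run(
        ["python", "train_zinc.py", "--latent_dim", str(latent_dim)],
        check=True,
    )
```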
Thank you very much. I appreciate your help.