Open redBirdTx opened 5 years ago
Currently not. For conditional optimization, we developed a new method called graph-to-graph translation that performs much better than running gradient ascent over the VAE latent space. The paper and GitHub link are here: https://github.com/wengong-jin/iclr19-graph2graph. We suggest you use that for conditional optimization instead.
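For readers unfamiliar with the baseline being compared against, "gradient ascent over the VAE latent space" means encoding a molecule, then iteratively moving its latent vector in the direction that increases a property predictor's output before decoding. Below is a minimal PyTorch sketch of that idea; `decoder`, `property_net`, and `latent_dim` are illustrative placeholders, not the actual models or API of this repo.

```python
import torch
import torch.nn as nn

# Placeholder stand-ins for a trained molecular VAE decoder and a property
# predictor fitted on its latent space; real models would be loaded from checkpoints.
latent_dim = 56
decoder = nn.Linear(latent_dim, 128)          # would map z -> a molecule (SMILES/graph)
property_net = nn.Sequential(                 # would predict e.g. penalized logP from z
    nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, 1)
)

# Start from the latent code of an existing molecule (random here for illustration).
z = torch.randn(1, latent_dim, requires_grad=True)
optimizer = torch.optim.Adam([z], lr=0.1)

for step in range(80):
    optimizer.zero_grad()
    score = property_net(z).sum()             # property we want to increase
    (-score).backward()                       # minimize the negative = gradient ascent
    optimizer.step()

optimized_molecule = decoder(z)               # decode the optimized latent vector
```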
Dear Wengong,
Does molopt have a faster implementation as well (i.e., skipping pretraining and using vae_train.py with the annealing strategy)?
Thank you!
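For context on the question above, the "annealing strategy" typically means gradually increasing the weight on the KL term during a single training run instead of first pretraining with the KL weight fixed at zero. A minimal sketch of such a schedule is below; the constants and function name are illustrative, not the values or interface used by vae_train.py.

```python
# Illustrative KL-weight (beta) annealing schedule for one-stage VAE training.
def kl_weight(step, warmup_steps=40000, max_beta=0.005):
    """Linearly ramp the KL term's weight from 0 up to max_beta."""
    return min(max_beta, max_beta * step / warmup_steps)

# Inside the training loop, the total loss would then be:
#   loss = reconstruction_loss + kl_weight(step) * kl_divergence
```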