-
The idea is to train an autoencoder on the entire dataset and take its middle layer (the embedding) as a feature on which to later train a classifier on the train set. This is really time-consuming and we'll have to use…
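A minimal sketch of that pipeline (layer sizes and names are illustrative, not from any particular project): the autoencoder is trained unsupervised on the full dataset, and only its encoder is then reused to produce features for the classifier.

```python
# Sketch: train an autoencoder on the whole dataset, then reuse its
# bottleneck ("middle layer") as features for a downstream classifier.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, in_dim=784, emb_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, emb_dim))          # bottleneck embedding
        self.decoder = nn.Sequential(
            nn.Linear(emb_dim, 256), nn.ReLU(),
            nn.Linear(256, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

def train_step(x):
    # Unsupervised reconstruction loss, so the entire dataset can be used
    recon, _ = model(x)
    loss = mse(recon, x)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

@torch.no_grad()
def embed(x):
    # Features for the classifier, computed on the train set only
    return model.encoder(x)
```

Keeping `embed` under `torch.no_grad()` ensures the classifier's gradients never touch the autoencoder, so the two training stages stay cleanly separated.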
-
### References
- http://ruishu.io/2016/12/25/gmvae/
- http://nbviewer.jupyter.org/github/RuiShu/vae-clustering/blob/master/experiments.ipynb
- https://github.com/dpkingma/nips14-ssl
### Citations
- D…
-
Hi. Thanks for this great project. I am using FloydHub to replicate the results of this project. In step 2 (training the variational autoencoder to calculate vectors in the latent variable space), after running t…
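For reference, this is roughly what "calculating vectors in latent variable space" with a trained VAE looks like; `vae.encoder` returning `(mu, logvar)` is an assumption about the model's interface, not this project's actual API.

```python
# Hedged sketch: extract latent vectors from a trained VAE by taking the
# encoder's posterior mean for each batch. Interface names are assumptions.
import torch

@torch.no_grad()
def latent_vectors(vae, loader, device="cpu"):
    vae.eval()
    zs = []
    for x, _ in loader:                # assumes (input, label) batches
        mu, logvar = vae.encoder(x.to(device))
        zs.append(mu.cpu())            # posterior mean; sampling is optional
    return torch.cat(zs)
```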
-
https://doi.org/10.1101/433763
> Recent advances in deep learning, particularly unsupervised approaches, have shown promise for furthering our biological knowledge through their application to gen…
-
Can you figure out a way to beat our current metrics (for the low-level pipeline) of 0.456 (PixCorr) and 0.493 (SSIM) for subject 1? Use any method you can think of to try to improve upon the current approa…
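For anyone attempting this, here is a hedged sketch of how these two low-level metrics are commonly computed; the project's exact preprocessing (resizing, grayscale conversion, value range) may differ.

```python
# Sketch of PixCorr and SSIM as commonly defined; assumes RGB images
# with values in [0, 1]. Not necessarily this project's exact protocol.
import numpy as np
from skimage.metrics import structural_similarity as ssim
from skimage.color import rgb2gray

def pixcorr(gt, rec):
    # Pearson correlation between flattened ground truth and reconstruction
    return np.corrcoef(gt.ravel(), rec.ravel())[0, 1]

def ssim_gray(gt, rec):
    # SSIM computed on grayscale versions of the two images
    return ssim(rgb2gray(gt), rgb2gray(rec), data_range=1.0)

# Averaging both scores over all test images gives per-subject numbers
# comparable to the 0.456 / 0.493 figures quoted above.
```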
-
Hi all!
First, thanks for your awesome package!
Do you plan on handling gradients in stochastic computation graphs, *i.e.* graphs with conditional probability distributions, such as
```ju…
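(The snippet above is truncated.) For context, the two standard estimators for gradients through stochastic nodes look like this in PyTorch; this is only an illustration of the technique being asked about, not the package's API.

```python
# Sketch of the two standard gradient estimators for stochastic nodes,
# using torch.distributions (illustrative only).
import torch
from torch.distributions import Normal

theta = torch.tensor([0.0], requires_grad=True)

# 1) Reparameterization (pathwise) gradient: z = mu + sigma * eps,
#    so the gradient flows through the sample itself.
z = Normal(theta, 1.0).rsample()
loss = (z ** 2).mean()
loss.backward()

theta.grad = None

# 2) Score-function (REINFORCE) gradient for non-reparameterizable nodes:
#    grad E[f(z)] = E[f(z) * grad log p(z | theta)]
dist = Normal(theta, 1.0)
z = dist.sample()                    # no pathwise gradient here
f = (z ** 2).detach()                # downstream cost, treated as constant
surrogate = dist.log_prob(z) * f
surrogate.mean().backward()
```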
-
Kingma, Diederik Pieter. [Variational inference & deep learning: A new synthesis](https://pure.uva.nl/ws/files/17891313/Thesis.pdf).
-
NLG, the other end of NLP, is important in many fields where AI is being applied. Please include the latest NLG research as well; IMO it would be very helpful.
-
beta-VAE is also a very good ref: http://openreview.net/forum?id=Sy2fzU9gl
Learning an interpretable factorised representation of the independent data generative factors of the world without super…
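A minimal sketch of the beta-VAE objective described in that paper: a standard VAE loss whose KL term is scaled by beta > 1 (the paper uses values such as 4) to push the latents toward independent, factorised codes. Names and shapes here are illustrative.

```python
# Sketch of the beta-VAE loss: reconstruction term plus a KL term
# scaled by beta > 1. Assumes a diagonal Gaussian encoder and
# reconstructions in [0, 1] (Bernoulli likelihood).
import torch
import torch.nn.functional as F

def beta_vae_loss(recon_x, x, mu, logvar, beta=4.0):
    # Bernoulli reconstruction likelihood (binary cross-entropy)
    recon = F.binary_cross_entropy(recon_x, x, reduction="sum")
    # KL(q(z|x) || N(0, I)) in closed form for a diagonal Gaussian
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl
```

With beta = 1 this reduces to the ordinary VAE ELBO; increasing beta trades reconstruction fidelity for disentanglement.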