Closed: EderSantana closed this issue 8 years ago.
@fchollet, @farizrahman4u, @phreeza and others: here I wrote a variational layer for variational autoencoders, or just a new type of regularization (that is better than plain L2, btw): https://github.com/EderSantana/seya/blob/master/examples/Convolutional%20Variational%20AE.ipynb
My question is: is this interesting for the main repo? Should I PR it as a new type of layer? What arguments would you see for and against it? I don't want to get Keras bloated; I just want to make sure it is general enough to be here.

This is cool!
Any updates on the possibility of adding this layer? I think it would be very welcome :)
I also saw this notebook the other day, which has a similar implementation in TensorFlow and could be adapted or imitated in Keras: https://jmetzen.github.io/2015-11-27/vae.html
Hi all,
I'm discovering Keras and finding it great. Awesome work. @EderSantana I had a look at your VAE implementation, thanks for sharing ;)
A comment that comes to mind: you created a VariationalDense layer that contains the last layer of weights plus the sampling process (at training time), and you regularize (mu, sigma) with the Gaussian KL term. I believe it would be simpler to have, instead of the VariationalDense layer, a layer that just does the sampling, and to apply the activity regularizer on a standard Dense layer. What do you think?
I can suggest an implementation along these lines; the KL term involved is sketched below.
A
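For reference, the Gaussian KL term being discussed has a closed form. A minimal sketch in Keras backend code (the helper name `gaussian_kl` and the use of `keras.backend` are illustrative, not code from this thread):

```python
from keras import backend as K

def gaussian_kl(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, 1)).

    `log_var` is log(sigma^2); the term is summed over the latent
    dimensions and averaged over the batch.
    """
    kl = -0.5 * K.sum(1.0 + log_var - K.square(mu) - K.exp(log_var), axis=-1)
    return K.mean(kl)
```

Applied as an activity regularizer, this pulls the approximate posterior toward the unit Gaussian prior, which is the sense in which it replaces a plain L2 penalty.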
@drasros this is a good idea; I implemented it this way before. But note that you would need two Dense layers: one to estimate the mean and another to estimate the variance. After that, you have to combine the two by multiplying the variance with noise. I believe you would need a Graph for that. Getting everything inside one layer was the way I found to make it behave like a single layer in a model. But if you implement it another way, let me know and I can give you more feedback.
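For context, the Graph container was how Keras 0.x expressed this kind of multi-branch wiring; with the functional API that later replaced it, the two-Dense-layer version can be wired directly. A minimal sketch of the approach described above, where the layer sizes (784 inputs, 256 hidden, 2 latent) are illustrative assumptions:

```python
from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

latent_dim = 2

x = Input(shape=(784,))
h = Dense(256, activation='relu')(x)
z_mean = Dense(latent_dim)(h)     # one Dense layer estimates the mean
z_log_var = Dense(latent_dim)(h)  # a second one estimates log(variance)

def sampling(args):
    # Reparameterization trick: scale unit Gaussian noise by the
    # standard deviation, exp(log_var / 2), and shift by the mean.
    z_mean, z_log_var = args
    eps = K.random_normal(shape=K.shape(z_mean))  # TensorFlow backend
    return z_mean + K.exp(z_log_var / 2.0) * eps

z = Lambda(sampling, output_shape=(latent_dim,))([z_mean, z_log_var])
encoder = Model(x, z)
```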
@EderSantana ok, thanks. Well, I don't see why two Dense layers are needed. The matrices W4 and W5 from the original article can be concatenated, so I think it is possible, for example, to have just one Dense layer and use the first half of its output activities as the means and the second half as the variances (sketched below)...
If I implement it this way I'll let you know ;)
Cheers
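A minimal sketch of this single-layer variant, reusing the `gaussian_kl` helper from above; the sizes and the use of `add_loss` for the penalty are illustrative choices, not code from the thread:

```python
from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

latent_dim = 2

x = Input(shape=(784,))
h = Dense(256, activation='relu')(x)
# A single Dense layer, i.e. W4 and W5 concatenated: the first half of
# the outputs are the means, the second half the log-variances.
stats = Dense(2 * latent_dim)(h)

def sample(stats):
    # The Lambda layer only samples; it holds no weights of its own.
    mu = stats[:, :latent_dim]
    log_var = stats[:, latent_dim:]
    eps = K.random_normal(shape=K.shape(mu))
    return mu + K.exp(log_var / 2.0) * eps

z = Lambda(sample, output_shape=(latent_dim,))(stats)
model = Model(x, z)
# Attach the KL penalty on the statistics (decoder and reconstruction
# loss omitted here for brevity):
model.add_loss(gaussian_kl(stats[:, :latent_dim], stats[:, latent_dim:]))
```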
@EderSantana since there seems to be plenty of interest, I think it would make sense to include it in the codebase (at least as an example). Would you like to make a PR?
@fchollet sure, I'll do it.
@EderSantana any news on this? Thanks
I'm submitting the PR in a bit.