-
I modified the variational autoencoder example from the Keras blog post at
https://blog.keras.io/building-autoencoders-in-keras.html,
hoping to use more layers.
But it comple…
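The modified Keras code isn't shown here, but a minimal NumPy sketch of the shape flow when stacking extra encoder layers may help pinpoint where a deeper VAE encoder goes wrong — all layer sizes below are hypothetical, and the weights are just random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(a):
    return np.maximum(a, 0.0)

x = rng.normal(size=(16, 784))  # batch of flattened inputs
# Two hidden layers instead of the tutorial's one (hypothetical sizes);
# each layer's input width must match the previous layer's output width.
W1 = rng.normal(size=(784, 256)) * 0.05
W2 = rng.normal(size=(256, 64)) * 0.05
h = relu(relu(x @ W1) @ W2)
# Separate heads for the mean and log-variance of q(z|x).
Wm = rng.normal(size=(64, 2)) * 0.05
Wv = rng.normal(size=(64, 2)) * 0.05
z_mean, z_log_var = h @ Wm, h @ Wv
# Reparameterization trick: z = mu + sigma * eps.
eps = rng.normal(size=z_mean.shape)
z = z_mean + np.exp(0.5 * z_log_var) * eps
assert z.shape == (16, 2)
```

The usual failure mode when deepening the encoder is a shape mismatch between consecutive layers, or between the last hidden layer and the mean/log-variance heads.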
-
> On May 7, 2016, at 11:27 AM, Rick Farouni rfarouni@gmail.com wrote:
>
> Hi Dustin,
>
> Is it straightforward to implement importance weighted autoencoders in Edward?
>
> Thanks
> Rick Farouni,…
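For context, the importance weighted autoencoder objective replaces the ELBO's average of log-weights with the log of the averaged weights. A small NumPy sketch of the bound (placeholder values, not Edward code):

```python
import numpy as np

rng = np.random.default_rng(0)
# Log importance weights log w_i = log p(x, z_i) - log q(z_i | x)
# for k = 5 posterior samples; the values here are just placeholders.
log_w = rng.normal(size=5)

def log_mean_exp(v):
    """Numerically stable log(mean(exp(v)))."""
    m = v.max()
    return m + np.log(np.mean(np.exp(v - m)))

iwae_bound = log_mean_exp(log_w)  # IWAE objective with k samples
elbo = log_w.mean()               # averaged single-sample ELBO
# By Jensen's inequality the IWAE bound is at least as tight as the ELBO.
assert iwae_bound >= elbo
```

With k = 1 the two objectives coincide; increasing k tightens the bound.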
-
Hi,
I am using the variational autoencoder described at http://blog.keras.io/building-autoencoders-in-keras.html.
My input shape is (22664, 678) and I have changed the loss to "categorical_cro…
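Assuming the truncated loss name above is categorical cross-entropy (an assumption — the name is cut off), a minimal sketch of what that loss computes on a single hypothetical row; note that it expects each target row to be a probability distribution, e.g. one-hot:

```python
import math

# Hypothetical single row: categorical cross-entropy assumes the target
# is a probability distribution (e.g. one-hot), not raw feature values.
target = [1.0, 0.0, 0.0]   # one-hot target
pred = [0.5, 0.25, 0.25]   # model output after softmax
ce = -sum(t * math.log(p) for t, p in zip(target, pred) if t > 0)
assert abs(ce - math.log(2)) < 1e-12
```

If the 678-dimensional rows are not normalized distributions, that assumption is violated, which is a common source of odd loss values.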
-
Thanks for an awesome set of tutorials! I was tinkering with this a bit, trying to merge the chapter on convolutional autoencoders with tied weights (09) and the variational autoencoder (11). The adap…
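A minimal NumPy sketch of the tied-weights idea being merged in — the decoder reuses the transpose of the encoder weight, so only one weight matrix is learned (sizes here are hypothetical, and the convolutional case ties flipped kernels analogously):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(a):
    return np.maximum(a, 0.0)

x = rng.normal(size=(8, 100))          # batch of inputs (hypothetical sizes)
W = rng.normal(size=(100, 32)) * 0.1   # single shared weight matrix
h = relu(x @ W)                        # encode
x_hat = h @ W.T                        # decode with the transposed weight
assert x_hat.shape == x.shape
```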
-
@twiecki @fonnesbeck
Did you guys see this workshop?
This workshop brings together developers of black box inference technologies, probabilistic programming systems, and connectionist computing fr…
-
I'm not sure I understand the backprop through the LSTM timesteps at lines 110-112 in `train.lua`.
Any chance of an explanation? Thanks :)
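Without `train.lua` in front of me I can't comment on those exact lines, but the general backprop-through-time pattern they likely implement — feeding each timestep's hidden-state gradient back into the previous step's backward pass — can be sketched for a scalar recurrence:

```python
# Generic BPTT sketch for h_t = w * h_{t-1} + x_t (not the actual
# train.lua code): going backward, each step receives the gradient of
# the *next* step's hidden state.
w, xs = 0.5, [1.0, -2.0, 0.5]

def forward(w):
    h, hs = 0.0, [0.0]
    for x in xs:
        h = w * h + x
        hs.append(h)
    return hs  # loss L = final hidden state

hs = forward(w)
dh, dw = 1.0, 0.0              # dL/dh_T = 1
for t in reversed(range(len(xs))):
    dw += dh * hs[t]           # local gradient w.r.t. w at step t
    dh *= w                    # pass gradient to the previous timestep
# Check against a central finite-difference estimate.
eps = 1e-6
num = (forward(w + eps)[-1] - forward(w - eps)[-1]) / (2 * eps)
assert abs(dw - num) < 1e-6
```

The `dh *= w` line is the key step: it is what carries the gradient backward across timesteps.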
-
The Adam optimizer is now available in the torch `optim` module:
https://github.com/torch/optim/blob/master/adam.lua
It might be interesting to swap RMSProp for it.
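For reference, the Adam update that `optim.adam` implements (Kingma & Ba, 2015) can be sketched in a few lines of Python, here minimizing a toy quadratic — the learning rate and step count are arbitrary choices for the demo:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba, 2015)."""
    m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - b1 ** t)           # bias corrections
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = (x - 3)^2; the gradient is 2 * (x - 3).
x, m, v = 0.0, 0.0, 0.0
for t in range(1, 1001):
    x, m, v = adam_step(x, 2 * (x - 3), m, v, t)
assert abs(x - 3) < 0.1
```

Unlike RMSProp, Adam keeps a first-moment (momentum) estimate alongside the squared-gradient average and bias-corrects both.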