-
I'm trying to follow your post at http://int8.io/automatic-differentiation-machine-learning-julia/
Here's my take on it: https://github.com/gaika/AutoDiffSource.jl/blob/master/examples/mnist_autoencoder.jl…
ghost updated
7 years ago
-
Dear rjpg,
Is AutoEncoderMNIST.py a stacked autoencoder? Could you please provide clarification on that?
Thanks.
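For reference: a stacked autoencoder chains several encoder layers before the bottleneck and mirrors them in the decoder, whereas a basic autoencoder has a single hidden layer. Whether AutoEncoderMNIST.py does that depends on its layer list. A minimal numpy sketch of the shape progression (the sizes are illustrative, not taken from the script):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stacked layout for MNIST: 784 -> 256 -> 64 (code) -> 256 -> 784.
# A single-layer autoencoder would have only one hidden layer in between.
sizes = [784, 256, 64, 256, 784]
weights = [rng.normal(0, 0.01, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x, weights):
    """Pass a batch through every layer with tanh non-linearities."""
    h = x
    for w in weights:
        h = np.tanh(h @ w)
    return h

x = rng.normal(size=(5, 784))   # a batch of 5 flattened images
recon = forward(x, weights)
print(recon.shape)              # (5, 784): reconstruction matches the input
```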
-
Currently, `clmbr_train_model` hides the more familiar training-loop structure from users. In most demos and APIs, the boilerplate looks like what's outlined here https://github.com/PyTorchLightning/p…
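For comparison, the familiar boilerplate usually amounts to the epoch/batch loop below. This is a generic sketch, not the actual `clmbr_train_model` internals; the "model" is a toy one-parameter least-squares fit in numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + noise; the model is a single weight w.
x = rng.normal(size=(256, 1))
y = 3.0 * x + 0.1 * rng.normal(size=(256, 1))

w = np.zeros((1, 1))
lr, batch_size = 0.1, 32

# The loop structure most training wrappers hide:
# epochs over batches, forward pass, loss gradient, parameter update.
for epoch in range(20):
    for start in range(0, len(x), batch_size):
        xb, yb = x[start:start + batch_size], y[start:start + batch_size]
        pred = xb @ w                            # forward pass
        grad = 2 * xb.T @ (pred - yb) / len(xb)  # d(MSE)/dw
        w -= lr * grad                           # SGD update

print(round(float(w[0, 0]), 1))  # 3.0: the true slope is recovered
```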
-
Hello Again,
I was wondering if you could provide any insight into the steps for "feeding" other data sets into the proposed SNN. I got the MNIST set to work perfectly with the code (thank you f…
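In general, feeding a different dataset into MNIST-shaped code comes down to matching the expected input format: flatten each image, cast to float, scale to [0, 1], and one-hot the labels. A hedged numpy sketch (the 28x28 grayscale shape mirrors MNIST; adjust for your data, and `prepare_like_mnist` is an illustrative helper, not from the repository):

```python
import numpy as np

def prepare_like_mnist(images, labels, num_classes=10):
    """Reshape uint8 image data into the flat, normalized format
    that MNIST-based code usually expects."""
    x = images.reshape(len(images), -1).astype(np.float32) / 255.0
    y = np.eye(num_classes, dtype=np.float32)[labels]  # one-hot encode
    return x, y

# Stand-in data: 4 random 28x28 grayscale "images" with labels.
rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, size=(4, 28, 28), dtype=np.uint8)
labs = np.array([0, 3, 9, 1])

x, y = prepare_like_mnist(imgs, labs)
print(x.shape, y.shape)        # (4, 784) (4, 10)
print(float(x.max()) <= 1.0)   # True: pixels scaled into [0, 1]
```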
-
Hi,
Did you ever figure out how it would be possible to create an autoencoder with RLS?
For example, with the MNIST dataset, to remove noise or to create new numbers.
Normally the autoencoder do…
snapo updated
11 months ago
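Recursive least squares itself is straightforward to prototype; a denoising setup would regress clean targets from noisy inputs with updates like the sketch below. This is plain numpy and a linear model, not a full autoencoder:

```python
import numpy as np

rng = np.random.default_rng(0)

n_features = 5
true_w = rng.normal(size=n_features)

# RLS state: weight estimate w and inverse-correlation matrix P.
w = np.zeros(n_features)
P = np.eye(n_features) * 1e3
lam = 0.99                      # forgetting factor

for _ in range(500):
    x = rng.normal(size=n_features)     # input sample
    d = true_w @ x                      # desired (clean) output
    Px = P @ x
    k = Px / (lam + x @ Px)             # gain vector
    w += k * (d - w @ x)                # weight update
    P = (P - np.outer(k, Px)) / lam     # inverse-correlation update

print(np.allclose(w, true_w, atol=1e-3))  # True: weights recovered
```

The same update rule would apply per-layer if you wanted to train a linear encoder/decoder pair this way.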
-
MNIST Results:
We used a standard variational autoencoder, trained on the inlier digits 1 and 3. Each model was trained with a batch size of 128, where each image is an unnormalized grayscale image of…
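For context, the KL term of the standard VAE objective has a closed form for a diagonal Gaussian posterior. A hedged numpy sketch of the two ELBO pieces (the reconstruction term here is plain MSE, and the numbers are illustrative, not the paper's setup):

```python
import numpy as np

def vae_losses(x, x_recon, mu, log_var):
    """Per-batch VAE loss terms: MSE reconstruction plus the closed-form
    KL divergence KL(N(mu, sigma^2) || N(0, 1))."""
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=1))
    kl = np.mean(0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1))
    return recon, kl

rng = np.random.default_rng(0)
x = rng.random((128, 784))      # a batch of 128 flattened images

# A posterior exactly equal to the prior gives zero KL, and a perfect
# reconstruction gives zero MSE:
mu = np.zeros((128, 8))
log_var = np.zeros((128, 8))
recon, kl = vae_losses(x, x, mu, log_var)
print(recon, kl)   # 0.0 0.0
```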
-
I am trying to reproduce the basic autoencoder example from the Keras blog:
https://blog.keras.io/building-autoencoders-in-keras.html
```python
# Define basic parameters
encoding_dim = 32
batch_siz…
```
EvanZ updated
5 years ago
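For anyone reproducing this: the blog's model is a single dense bottleneck, and the tensor shapes can be sanity-checked without Keras at all. A numpy sketch with the blog's 32-dimensional code (random weights stand in for the trained Dense layers; this mimics the shapes, not the training):

```python
import numpy as np

rng = np.random.default_rng(0)

encoding_dim = 32   # size of the compressed representation
input_dim = 784     # 28*28 flattened MNIST image

# Random matrices standing in for the trained Dense layer weights.
W_enc = rng.normal(0, 0.05, (input_dim, encoding_dim))
W_dec = rng.normal(0, 0.05, (encoding_dim, input_dim))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.random((10, input_dim))      # batch of 10 images
code = np.maximum(0, x @ W_enc)      # ReLU encoder
recon = sigmoid(code @ W_dec)        # sigmoid decoder
print(code.shape, recon.shape)       # (10, 32) (10, 784)
```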
-
Recently, I have been studying your code for the adversarial autoencoder. I am surprised by its excellent performance on MNIST. Since the original paper also runs the experiment on the SVHN dataset, I don…
hldwc updated
7 years ago
-
Hi @lucidrains !
Do you have any idea/insight on how to supervise classification (let's say, for example, MNIST digits classification) after having trained GLOM in an unsupervised way as a denoising …
-
I tried to run the code for MNIST. Something goes wrong when loading the images in the function:
def inf_train_gen(train_gen):
    while True:
        for i, (images, _) in enumerate(train_gen):
            …
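If the loader itself is the problem, it helps to test the infinite-generator pattern in isolation; the function is just meant to cycle the loader forever. A self-contained version with a plain list standing in for `train_gen` (the fake loader is illustrative, not the repository's data pipeline):

```python
import numpy as np

def inf_train_gen(train_gen):
    """Yield image batches from train_gen forever by restarting it each epoch."""
    while True:
        for images, _ in train_gen:
            yield images

# A list of (images, labels) tuples standing in for a DataLoader.
fake_loader = [(np.ones((4, 1, 28, 28)) * i, None) for i in range(3)]

gen = inf_train_gen(fake_loader)
seen = [float(next(gen)[0, 0, 0, 0]) for _ in range(7)]  # wraps past one epoch
print(seen)   # [0.0, 1.0, 2.0, 0.0, 1.0, 2.0, 0.0]
```

If this works but the real loader fails, the bug is in the dataset/transform setup rather than in `inf_train_gen`.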