casperkaae / parmesan

Variational and semi-supervised neural network toppings for Lasagne

Replace bernoulli sampling with binarized_mnist dataset #10

Closed: dribnet closed this 9 years ago

dribnet commented 9 years ago

Updates the iw_vae example, replacing the Bernoulli resampling that happens every epoch with the standard binarized_mnist dataset. The kerosene library was used because it is logically a drop-in replacement with binarized_mnist support.

Fixes #8.

(Will follow up with training results; training is in progress.)
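To make the change concrete, here is a minimal sketch (not the PR's actual code) contrasting the two binarization schemes: per-epoch Bernoulli resampling of grayscale MNIST versus a single fixed binarization, which is what the standard binarized_mnist dataset provides. The `x_gray` array and the function name are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.RandomState(0)

# Hypothetical grayscale MNIST batch: pixel intensities in [0, 1).
x_gray = rng.uniform(size=(4, 784)).astype(np.float32)

def bernoulli_resample(x, rng):
    """Stochastic binarization: each pixel is drawn Bernoulli(p),
    with p equal to its grayscale intensity."""
    return (rng.uniform(size=x.shape) < x).astype(np.float32)

# Before this PR: a fresh binarization every epoch.
epoch1 = bernoulli_resample(x_gray, rng)
epoch2 = bernoulli_resample(x_gray, rng)

# After this PR: one fixed binarization reused for all epochs,
# as in the standard binarized_mnist dataset.
x_fixed = bernoulli_resample(x_gray, np.random.RandomState(42))
```

With resampling, the model effectively sees a slightly different dataset each epoch (a mild regularizer); with the fixed dataset, results are directly comparable to published binarized-MNIST likelihoods.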

dribnet commented 9 years ago

Here's a preview. Command from README:

python examples/iw_var.py -eqsamples 1 -iw_samples 1 -lr 0.001 -nhidden 500 -nlatent 100 -nonlin_dec very_leaky_rectify -nonlin_enc rectify

100 epoch result before this commit:

[plot: eval_l5000] (somewhat troubling that it doesn't seem to match the graph on the home page)

100 epoch result after this commit:

[plot: eval_l5000]

So even this early in training there is a slight difference, showing that this change makes training more challenging, as expected. I will now let the latter run to 10k epochs, though on my GTX 680 that looks to take about 36 hours.

casperkaae commented 9 years ago

Thanks for the fixes. In some earlier experiments we saw that resampling every epoch was 1-2 nats better. I think I'll add a switch to select between the fixed binarized and resampled MNIST datasets.
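Such a switch could be sketched as a small generator that either reuses one fixed binarization or redraws one per epoch; this is a hypothetical illustration of the proposed option, not the code that was eventually merged.

```python
import numpy as np

def epoch_batches(x_gray, fixed_binarization=True, seed=1234):
    """Yield one binarized copy of the (hypothetical) grayscale data
    per epoch.

    fixed_binarization=True  -> one binarization, reused every epoch
                                (the standard binarized_mnist setup)
    fixed_binarization=False -> a fresh Bernoulli draw each epoch
    """
    rng = np.random.RandomState(seed)
    x_fixed = (rng.uniform(size=x_gray.shape) < x_gray).astype(np.float32)
    while True:
        if fixed_binarization:
            yield x_fixed
        else:
            yield (rng.uniform(size=x_gray.shape) < x_gray).astype(np.float32)

# Usage: x_gray = ...; batches = epoch_batches(x_gray, fixed_binarization=False)
```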

casperkaae commented 9 years ago

This is closed via #11. I wrote a simple loader for the binarized MNIST data since I prefer not having too many dependencies in our bundled examples.
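A dependency-free loader along those lines might look like the sketch below, assuming the data is distributed in the usual `.amat` text format (one image per line, 784 space-separated 0/1 values); the function name and file path are hypothetical, not necessarily what #11 implements.

```python
import numpy as np

def load_binarized_mnist(path):
    """Parse a binarized MNIST .amat file into a float32 array
    of shape (n_images, n_pixels). Each non-empty line holds one
    image as space-separated 0/1 values."""
    with open(path) as f:
        rows = [line.split() for line in f if line.strip()]
    return np.asarray(rows, dtype=np.float32)

# Usage: x_train = load_binarized_mnist("binarized_mnist_train.amat")
```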