Closed: dribnet closed this 9 years ago
Here's a preview. Command from README:
python examples/iw_vae.py -eqsamples 1 -iw_samples 1 -lr 0.001 -nhidden 500 -nlatent 100 -nonlin_dec very_leaky_rectify -nonlin_enc rectify
100 epoch result before this commit:
(somewhat troubling that it doesn't seem to match the graph on the home page)
100 epoch result after this commit:
So even just after starting there is a slight difference, showing that this change makes training more challenging, as expected. I'll now let the latter run to 10k epochs, though on my GTX 680 that looks like it will take about 36 hours.
Thanks for the fixes. In some earlier experiments we saw that it was 1-2 nats better to do the resampling every epoch. I think I'll add a switch to select between the fixed binarized and the resampled MNIST dataset.
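For concreteness, a minimal sketch of what such a switch could look like, assuming a numpy-based pipeline (the `binarize` helper, its `mode` argument, and the flag name are hypothetical, not Parmesan's actual code):

```python
import numpy as np

def binarize(x_real, mode="fixed", rng=None):
    """Binarize real-valued MNIST pixels in [0, 1].

    mode="fixed":    threshold once, giving a deterministic dataset
    mode="resample": draw a fresh Bernoulli sample (e.g. every epoch)
    """
    rng = rng if rng is not None else np.random.RandomState(0)
    if mode == "fixed":
        return (x_real > 0.5).astype("float32")
    # each pixel intensity is treated as the probability of a 1
    return (rng.uniform(size=x_real.shape) < x_real).astype("float32")

# per-epoch resampling would then look roughly like:
# for epoch in range(num_epochs):
#     x_train = binarize(x_real_train, mode="resample", rng=rng)
#     ... train on x_train ...
```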
This is closed via #11. I wrote a simple loader for the binarized MNIST data since I prefer not having too many dependencies for our bundled examples.
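Such a loader can be quite small. Here is a hedged sketch along those lines (not necessarily the loader merged in #11), assuming the commonly used `.amat` files hosted by Larochelle:

```python
import os
import urllib.request
import numpy as np

# Larochelle's fixed binarization of MNIST, hosted as plain-text .amat files
BASE_URL = ("http://www.cs.toronto.edu/~larochelle/public/"
            "datasets/binarized_mnist/")

def load_binarized_mnist(split, data_dir="data"):
    """Download (if needed) and load one split: 'train', 'valid', or 'test'."""
    fname = "binarized_mnist_%s.amat" % split
    path = os.path.join(data_dir, fname)
    if not os.path.exists(path):
        os.makedirs(data_dir, exist_ok=True)
        urllib.request.urlretrieve(BASE_URL + fname, path)
    # each row holds 784 space-separated 0/1 pixel values, one image per row
    return np.loadtxt(path, dtype="float32")

# x_train = load_binarized_mnist("train")
```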
Updates the iw_vae example, replacing the Bernoulli resampling that was happening every epoch with the standard binarized_mnist dataset. The kerosene library was used because it is logically a drop-in replacement with binarized_mnist support (see the usage sketch below).
Fixes #8.
(Will follow up with results of the training, which is in progress.)
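For illustration, a sketch of the drop-in usage, assuming kerosene's binarized_mnist module follows the Keras-style `load_data` convention (the exact return signature here is an assumption, not confirmed from the PR):

```python
# assumed Keras-style interface; check kerosene's README for the exact API
from kerosene.datasets import binarized_mnist

# binarized MNIST ships without labels, so each split is a bare image array
(x_train,), (x_test,) = binarized_mnist.load_data()
```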