doomie / HessianFree

An implementation of the Hessian-free optimization algorithm in Theano

An open-source implementation of the Hessian-free optimization algorithm, based on the "Deep Learning via Hessian-free Optimization" paper by James Martens (ICML 2010, http://www.cs.toronto.edu/~jmartens/docs/Deep_HessianFree.pdf). The implementation is in Theano (http://deeplearning.net/software/theano), a highly efficient framework for defining and optimizing expressions with multi-dimensional arrays, and re-uses a lot of code from the Deep Learning Tutorials (http://www.deeplearning.net/tutorial).
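The core primitive behind Hessian-free optimization is the matrix-free Hessian-vector product: curvature information is accessed only through products H·v, never by forming H itself. As a minimal hedged sketch (not code from this repository; the cost and variable names are toy placeholders), this is how such a product can be expressed in Theano with the R-operator:

```python
import numpy
import theano
import theano.tensor as T

x = T.vector('x')                 # parameter vector (illustrative)
v = T.vector('v')                 # direction to multiply the Hessian by
cost = T.sum(x ** 2) + T.prod(x)  # toy cost; any scalar expression works

grad = T.grad(cost, x)            # symbolic gradient d(cost)/dx
Hv = T.Rop(grad, x, v)            # R-operator applied to the gradient gives H*v

hess_vec = theano.function([x, v], Hv)
print(hess_vec(numpy.ones(3), numpy.arange(3.0)))
```

The same product can also be written as `T.grad(T.sum(grad * v), x)` (Pearlmutter's double-backpropagation trick); either way, one H·v costs only a small constant multiple of one gradient evaluation.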

The prerequisites for using the Hessian-free code are:

- a working Python installation with Theano and its dependencies (notably NumPy and SciPy);
- the MNIST dataset, as used in the Deep Learning Tutorials.

Once these prerequisites are satisfied, you can run the code as follows:

python run_autoencoder.py -p jacobi -b 500 -e 50 -d mnist -o hf

This will load the MNIST dataset, perform 50 iterations of layer-wise RBM pre-training, and then run Hessian-free optimization of the global reconstruction cost using mini-batches of size 500 and an (approximate) Jacobi preconditioner. Supervised learning follows.
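Inside each Hessian-free update, a linear system in the curvature matrix is solved approximately with preconditioned conjugate gradient; the `-p jacobi` option selects a diagonal (Jacobi) preconditioner. Below is a minimal NumPy sketch of such an inner solver, not code from this repository: `hess_vec` stands in for a Hessian-vector product routine such as the Theano one above, and the damping and CG stopping heuristics used by the full algorithm are omitted.

```python
import numpy as np

def preconditioned_cg(hess_vec, b, diag, max_iters=50, tol=1e-6):
    """Approximately solve H x = b, preconditioning with M = diag(diag)."""
    x = np.zeros_like(b)
    r = b - hess_vec(x)            # initial residual
    z = r / diag                   # apply the Jacobi preconditioner M^{-1}
    p = z.copy()
    rz = np.dot(r, z)
    for _ in range(max_iters):
        Hp = hess_vec(p)
        alpha = rz / np.dot(p, Hp) # step length along the search direction
        x += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) < tol:
            break
        z = r / diag
        rz_new = np.dot(r, z)
        p = z + (rz_new / rz) * p  # new preconditioned search direction
        rz = rz_new
    return x
```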

This code is a companion to the following paper:

Olivier Chapelle and Dumitru Erhan. Improved Preconditioner for Hessian Free Optimization. In NIPS Workshop on Deep Learning and Unsupervised Feature Learning, 2011. http://olivier.chapelle.cc/pub/precond.pdf
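The improvement studied in that paper concerns the choice of the diagonal used by the preconditioner. As a hedged, generic sketch (not necessarily the paper's exact construction), one way to estimate the diagonal of a curvature matrix using only matrix-vector products is the randomized estimator of Bekas et al., diag(H) ≈ E[v ⊙ Hv] for random ±1 vectors v:

```python
import numpy as np

def estimate_diagonal(hess_vec, dim, num_samples=20, seed=0):
    """Estimate diag(H) from Hessian-vector products alone: for v with
    i.i.d. +/-1 entries, E[v * (H v)] equals the diagonal of H."""
    rng = np.random.default_rng(seed)
    est = np.zeros(dim)
    for _ in range(num_samples):
        v = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe vector
        est += v * hess_vec(v)
    return est / num_samples
```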

For questions and/or comments, contact Olivier Chapelle and Dumitru Erhan (usernames chap and dumitru on the yahoo-inc.com domain).