I use MXNet (http://mxnet.io/) to predict the 68 facial landmark points used in the 300-W dataset (http://ibug.doc.ic.ac.uk/resources/300-W/).
$ cd prepare_data
$ python collect_augmented_images.py
This creates training_data.lst and test_data.lst.
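For reference, an .lst file in MXNet's im2rec format is tab-separated: an integer index, the label values (here the 68 landmarks flattened to 136 x/y coordinates), and the image path. The helper below and its two-point example are an illustrative sketch, not code taken from collect_augmented_images.py:

```python
# Sketch of formatting one .lst line for MXNet's im2rec tool.
# Fields are tab-separated: index, label values, image path.
# With 68 landmarks the label block has 136 values; the example
# below uses only two points to keep the output short.

def make_lst_line(index, landmarks, image_path):
    """Format one .lst line: index \t x1 \t y1 \t ... \t path."""
    labels = [coord for point in landmarks for coord in point]
    fields = [str(index)] + [f"{v:.4f}" for v in labels] + [image_path]
    return "\t".join(fields)

line = make_lst_line(0, [(0.25, 0.5), (0.75, 0.5)], "images/face_0001.jpg")
print(line)  # 0	0.2500	0.5000	0.7500	0.5000	images/face_0001.jpg
```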
$ cd prepare_data
$ sh im2rec.sh
This creates training_data.rec and test_data.rec.
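A script that converts the multi-label .lst files into .rec files might look roughly like this; the tool path, list prefixes, and image root below are assumptions, not the actual contents of im2rec.sh:

```shell
#!/bin/sh
# Hypothetical sketch of an im2rec invocation; the tool path and
# prefixes are assumptions based on the steps above.
# --pack-label stores the 136 landmark labels inside each record.
python ../mxnet/tools/im2rec.py training_data ./ --pack-label
python ../mxnet/tools/im2rec.py test_data ./ --pack-label
```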
$ cd ../mxnet
$ sh train_net.sh
You can adjust several training parameters in train_net.sh.
In this repository, I added two networks: VGG-16 (mxnet/symbols/vgg_16_reduced.py) and Inception with batch normalization (mxnet/symbols/inception_bn.py).
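To give a feel for the kind of knobs such a script exposes, here is a hypothetical sketch; the flag names and values are illustrative, not read from train_net.sh:

```shell
#!/bin/sh
# Illustrative sketch only: flag names and defaults are assumptions.
# --network selects one of the two symbol files described above
# (vgg_16_reduced or inception_bn).
python train.py \
    --network vgg_16_reduced \
    --batch-size 32 \
    --lr 0.01 \
    --num-epochs 50 \
    --gpus 0
```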
$ python draw_result.py
This assumes that you trained VGG-16, so that the related files (vgg_16_reduced-0050.params, vgg_16_reduced-symbol.json) are located in 'mxnet'.
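Those two file names follow MXNet's checkpoint convention: prefix-symbol.json holds the network definition and prefix-NNNN.params holds the weights after epoch NNNN (zero-padded to four digits). A small sketch of the naming, in pure Python:

```python
# MXNet saves checkpoints as <prefix>-symbol.json (network definition)
# and <prefix>-<epoch>.params with the epoch zero-padded to 4 digits.

def checkpoint_files(prefix, epoch):
    """Return the (symbol, params) file names MXNet's checkpointing uses."""
    return f"{prefix}-symbol.json", f"{prefix}-{epoch:04d}.params"

symbol_file, params_file = checkpoint_files("vgg_16_reduced", 50)
print(symbol_file)   # vgg_16_reduced-symbol.json
print(params_file)   # vgg_16_reduced-0050.params
```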
$ python test_net.py
These are some example results.
$ python predict.py
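As a sketch of the post-processing such a prediction step involves, the snippet below turns a flat coordinate vector into pixel-space points. It assumes (this is an assumption, not read from predict.py) that the network outputs 136 values in [0, 1], ordered x1, y1, ..., x68, y68:

```python
# Sketch of converting a flat, normalized landmark vector into
# pixel-space (x, y) pairs for a given image size. The normalized
# output format is an assumption for illustration.

def to_points(flat, width, height):
    """Reshape [x1, y1, x2, y2, ...] in [0, 1] into pixel (x, y) pairs."""
    assert len(flat) % 2 == 0
    return [(flat[i] * width, flat[i + 1] * height)
            for i in range(0, len(flat), 2)]

points = to_points([0.5, 0.5, 0.25, 0.75], width=200, height=100)
print(points)  # [(100.0, 50.0), (50.0, 75.0)]
```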