zhaoweicai / hwgq

Caffe implementation of accurate low-precision neural networks

add HWGQ LeNet example: binary weights, 2-bit uniform HWGQ activations #1

Closed: maltanar closed this 7 years ago

maltanar commented 7 years ago

I thought MNIST was a more reasonable starting point for experimenting with quantization, so I made an HWGQ version of the LeNet from the Caffe tutorial. Using only binary weights (including the first and last layers) and 2-bit uniform HWGQ activations, it trains in 30 seconds on a GTX 1070 and reaches 99% accuracy.
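For context, here is a minimal NumPy sketch of the two ingredients named above, as described in the HWGQ paper rather than taken from the repo's Caffe layers. The step size `DELTA`, the function names, and the per-channel weight scaling are illustrative assumptions; the paper precomputes the optimal step for a unit-Gaussian input, which batch normalization keeps the pre-activations close to.

```python
import numpy as np

# 2-bit uniform HWGQ has four levels {0, d, 2d, 3d}: zero plus m = 3
# positive steps. DELTA is a placeholder value, not the paper's
# precomputed optimum.
DELTA = 0.5
M = 3

def hwgq_forward(x, delta=DELTA, m=M):
    # Half-wave: negatives quantize to 0. Positives snap to the nearest
    # level i*delta and saturate at the top level m*delta.
    return delta * np.clip(np.round(x / delta), 0, m)

def hwgq_backward(x, grad_out, delta=DELTA, m=M):
    # Clipped-ReLU surrogate gradient: pass the gradient only where
    # 0 < x <= m*delta, since the hard quantizer's true derivative is
    # zero almost everywhere.
    return grad_out * ((x > 0) & (x <= m * delta))

def binarize_weights(w):
    # BWN-style binarization (an assumption about the repo's details):
    # sign(w) scaled by the mean absolute value per output channel.
    alpha = np.abs(w).mean(axis=tuple(range(1, w.ndim)), keepdims=True)
    return alpha * np.sign(w)

if __name__ == "__main__":
    x = np.array([-1.0, 0.2, 0.6, 2.9])
    print(hwgq_forward(x))  # -> [0.  0.  0.5 1.5]
```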

zhaoweicai commented 7 years ago

hi @maltanar Thanks for providing the training scripts for LeNet on MNIST. The results look good. Could you make some changes to your training scripts so that they follow the same format as the provided ImageNet examples? For example, match the names and contents of the training scripts, and put them inside a folder, just to keep the examples in this repository consistent. Then I can merge your branch.

maltanar commented 7 years ago

What's the convention for the training case names you've used, so I can follow the same pattern for LeNet? For instance, what do the names alex-hwgq-3ne-clip-poly-320k and alex-hwgq-2n-clip-step-160k indicate in your ImageNet training examples?

zhaoweicai commented 7 years ago

hi @maltanar Since LeNet is the only model for the MNIST dataset, I think you can simply name the folder "LeNet-hwgq-3ne", then move and rename the training scripts ("solver.prototxt", "train_lenet_mnist.sh" and "train_val.prototxt") into it. Also edit "train_lenet_mnist.sh" in the same way as I do in the ImageNet examples. Thanks!

maltanar commented 7 years ago

Sure -- how does it look now?

- lenet-hwgq-3ne/solver.prototxt
- lenet-hwgq-3ne/train_lenet_mnist.sh
- lenet-hwgq-3ne/train_val.prototxt
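For anyone reproducing this layout, a launcher in the usual Caffe style might look like the sketch below. The paths, GPU id, and log handling are assumptions, not the actual contents of the merged train_lenet_mnist.sh.

```bash
#!/usr/bin/env bash
# Hypothetical sketch of lenet-hwgq-3ne/train_lenet_mnist.sh:
# a standard Caffe launcher, assumed to run from the Caffe root.
set -e

CAFFE=./build/tools/caffe

$CAFFE train \
    --solver=examples/lenet-hwgq-3ne/solver.prototxt \
    --gpu=0 \
    2>&1 | tee examples/lenet-hwgq-3ne/train.log
```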

zhaoweicai commented 7 years ago

Looks good. I did the merge. Thanks!