If you find ReBNet useful, please cite the ReBNet paper:
@inproceedings{rebnet,
author = {Mohammad Ghasemzadeh and Mohammad Samragh and Farinaz Koushanfar},
title = {ReBNet: Residual Binarized Neural Network},
booktitle = {Proceedings of the 26th IEEE International Symposium on Field-Programmable Custom Computing Machines},
series = {FCCM '18},
year = {2018}
}
The repo is organized as follows:
training-software: contains the Python code for training ReBNet.
bnn: contains the FPGA hardware implementation.
For training ReBNet, you should have Keras installed with either the TensorFlow or the MXNet backend: use the TensorFlow backend for MNIST, CIFAR-10, and SVHN, and the MXNet backend for ImageNet.
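Keras selects its backend through the keras.json configuration file. A minimal sketch of switching it programmatically (the helper name is ours; `~/.keras/keras.json` is the standard Keras config location):

```python
import json
import os

def set_keras_backend(backend, cfg_path=None):
    """Rewrite the "backend" field of the Keras config file.

    Hypothetical helper: edits ~/.keras/keras.json so that `backend`
    becomes e.g. "tensorflow" or "mxnet", preserving the other fields.
    """
    if cfg_path is None:
        cfg_path = os.path.expanduser("~/.keras/keras.json")
    with open(cfg_path) as f:
        cfg = json.load(f)
    cfg["backend"] = backend
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=4)
```

You can achieve the same thing by editing the file by hand, as shown in the ImageNet instructions below.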
To train on MNIST, set the following flags in Binary.py, then run it:
dataset="MNIST"
Train=True
Evaluate=False
python Binary.py
To train on CIFAR-10, set the following flags in Binary.py, then run it:
dataset="CIFAR-10"
Train=True
Evaluate=False
python Binary.py
To train on SVHN, set the following flags in Binary.py, then run it:
dataset="SVHN"
Train=True
Evaluate=False
python Binary.py
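The three runs above differ only in the flags set in Binary.py. A minimal sketch of a wrapper that automates the flag edit (the `configure` helper, and the assumption that the flags appear verbatim near the top of Binary.py, are ours):

```python
import re

def configure(script, dataset="MNIST", train=True, evaluate=False):
    """Rewrite the dataset/Train/Evaluate flags inside the training script.

    Hypothetical convenience wrapper: assumes the flags appear verbatim as
    dataset="...", Train=..., Evaluate=... in the script source.
    """
    with open(script) as f:
        src = f.read()
    src = re.sub(r'dataset\s*=\s*"[^"]*"', 'dataset="%s"' % dataset, src, count=1)
    src = re.sub(r'\bTrain\s*=\s*\w+', 'Train=%s' % train, src, count=1)
    src = re.sub(r'\bEvaluate\s*=\s*\w+', 'Evaluate=%s' % evaluate, src, count=1)
    with open(script, "w") as f:
        f.write(src)

# Usage (then run `python Binary.py` as above):
# for ds in ("MNIST", "CIFAR-10", "SVHN"):
#     configure("Binary.py", dataset=ds)
```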
To speed up ImageNet training, you need to uninstall your Keras and install an older version of it. If you do not wish to do that, you will need to write the training script yourself, since our code only works with the older version of Keras.
To train on ImageNet, point the training script at your ImageNet record files, e.g.:
args.data_train='/home/hamid/imagenet/train.rec'
args.data_val='/home/hamid/imagenet/val.rec'
Then switch the Keras backend in keras.json:
"backend": "tensorflow" -> "backend": "mxnet"
and launch training:
python Binary.py --batch-per-gpu 64 --num-gpus 4
To evaluate a trained model, set the following flags in Binary.py, then run it:
Train=False
Evaluate=True
python Binary.py
We provide pretrained weights in "models/DATASET/x_residuals.h5", where x is the number of levels in residual binarization. If you train the models from scratch, these weights will be replaced by your trained weights.
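The path pattern above can be built programmatically; a small sketch (the `weights_path` helper is ours, while the pattern itself comes from this README):

```python
import os

def weights_path(dataset, levels, root="models"):
    """Build the pretrained-weights path "models/DATASET/x_residuals.h5".

    `dataset` is e.g. "MNIST" or "CIFAR-10"; `levels` is the number of
    residual-binarization levels x. The helper is hypothetical; only the
    path pattern is taken from the README.
    """
    return os.path.join(root, dataset, "%d_residuals.h5" % levels)

# Example: 2-level residual binarization on CIFAR-10 gives
# models/CIFAR-10/2_residuals.h5; a matching Keras model can then
# restore the weights with model.load_weights(...).
```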
To rebuild the hardware designs, clone the repo on a machine with the Vivado Design Suite installed (tested with 2017.1), then follow the step-by-step instructions below:
The relevant directories are:
clone_path/ReBNet/bnn/src/network/
clone_path/ReBNet/bnn/src/
Launch the build script with:
./make-hw.sh {network} {platform} {mode}
where {mode} is one of:
h: launch Vivado HLS synthesis
b: launch the Vivado project (needs the HLS synthesis results)
a: launch both
The results are written to clone_path/ReBNet/bnn/src/network/output/, which is organized as follows:
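A sketch of driving the build script from Python (the `make_hw` wrapper is ours; the valid modes come from this README, and concrete network/platform names must be taken from clone_path/ReBNet/bnn/src/network/):

```python
import subprocess

def make_hw(network, platform, mode, script="./make-hw.sh"):
    """Invoke make-hw.sh with one of the modes described above.

    Hypothetical wrapper: 'h' runs Vivado HLS synthesis, 'b' runs the
    Vivado project (needs the HLS synthesis results), 'a' runs both.
    """
    if mode not in ("h", "b", "a"):
        raise ValueError("mode must be 'h', 'b', or 'a'")
    subprocess.run([script, network, platform, mode], check=True)
```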