Ashokvardhan opened this issue 5 years ago
I just demonstrated how to change the model to a BNN for ResNet and vgg_cifar. You should replace the ReLU with Hardtanh in all the other models as well. Basically, you must ensure that you have BatchNorm and Hardtanh before every binary GEMM operation.
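The ordering described above (BatchNorm, then Hardtanh, then binarization feeding the binary GEMM) can be sketched roughly as follows. This is an illustrative NumPy sketch, not the repository's actual modules; the function names (`binary_gemm_block`, etc.) and the inference-style BatchNorm are assumptions made for the example:

```python
import numpy as np

def hardtanh(x):
    # Clip activations to [-1, 1], as nn.Hardtanh does
    return np.clip(x, -1.0, 1.0)

def binarize(x):
    # Sign binarization, mapping 0 to +1 so values stay in {-1, +1}
    return np.where(x >= 0, 1.0, -1.0)

def binary_gemm_block(x, w, gamma, beta, eps=1e-5):
    # BatchNorm (inference-style, per feature) -> Hardtanh -> binarize -> GEMM
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x = gamma * (x - mu) / np.sqrt(var + eps) + beta
    x = hardtanh(x)
    xb = binarize(x)   # GEMM inputs are in {-1, +1}
    wb = binarize(w)   # weights are also binarized
    return xb @ wb     # the "binary GEMM"

x = np.array([[1.0, -2.0], [0.5, 3.0]])
w = np.array([[1.0, -1.0, 0.5], [-2.0, 1.0, 1.0]])
out = binary_gemm_block(x, w, gamma=1.0, beta=0.0)
```

Since both operands of the matmul are in {-1, +1}, every entry of the output is an integer-valued sum of signed products, which is what allows the GEMM to be replaced by XNOR/popcount kernels.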
@itayhubara: I noticed that all the binarized neural network files
(alexnet_binary.py, resnet_binary.py, vgg_cifar10_binary.py)
use the Hardtanh activation function, whereas their respective parent architectures in alexnet.py, resnet.py, and vgg_cifar10.py
use ReLU. Is there any specific reason for this? In contrast, the Theano implementation of the BinaryConnect code here uses the ReLU activation when only the weights are binarized.
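The distinction the question is getting at can be shown in a small sketch: in a BinaryConnect-style layer only the weights are binarized, so the activations remain real-valued and ReLU still makes sense. This is an illustrative NumPy sketch under that assumption, not the Theano repository's actual code:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def binarize(w):
    # Sign binarization of the weights only, mapping 0 to +1
    return np.where(w >= 0, 1.0, -1.0)

def binaryconnect_layer(x, w):
    # BinaryConnect-style: weights in {-1, +1}, activations real-valued,
    # so ReLU is still a sensible nonlinearity here.
    return relu(x @ binarize(w))

x = np.array([[1.0, -1.0]])
w = np.array([[2.0], [-0.5]])
out = binaryconnect_layer(x, w)
```

In a full BNN, by contrast, the activations themselves must end up in {-1, +1} before the next binary GEMM, which is why ReLU (whose output is non-negative) is replaced by Hardtanh followed by sign binarization.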