MatthieuCourbariaux / BinaryNet

Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
BSD 3-Clause "New" or "Revised" License

How should I revise the code so that only the activations are binary? #18

Closed — zxzhijia closed this issue 7 years ago

zxzhijia commented 7 years ago

I'm wondering whether it is easy to revise the code to binarize only the activations, without binarizing the weights?

MatthieuCourbariaux commented 7 years ago

You might want to change the following lines: https://github.com/MatthieuCourbariaux/BinaryNet/blob/master/Train-time/cifar10.py#L48 https://github.com/MatthieuCourbariaux/BinaryNet/blob/master/Train-time/cifar10.py#L57
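To make the goal concrete, here is a minimal NumPy sketch of what "binary activations, real-valued weights" means in the forward pass. The helper names (`binarize_activation`, `forward`) are illustrative, not BinaryNet's actual API; BinaryNet itself implements this inside Theano/Lasagne layers.

```python
import numpy as np

def binarize_activation(x):
    # Deterministic binarization to {-1, +1}; the deterministic
    # forward-pass equivalent of BinaryNet's binary_tanh_unit.
    return np.where(x >= 0, 1.0, -1.0)

def forward(x, W_real, b):
    # Weights stay real-valued; only the activation is binarized.
    pre = x @ W_real + b
    return binarize_activation(pre)

# Tiny example: 1 sample, 3 inputs, 2 hidden units.
x = np.array([[0.5, -1.2, 0.3]])
W = np.array([[0.2, -0.4],
              [0.1,  0.3],
              [-0.5, 0.6]])
b = np.zeros(2)
print(forward(x, W, b))  # every entry is -1.0 or +1.0
```

During training one would still need a straight-through gradient estimator for the binarization step, which in the repo is handled by the custom rounding op.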

zxzhijia commented 7 years ago

I see. So should I just set W_LR_scale = 1 (the second line you mentioned)?

zxzhijia commented 7 years ago

In addition, I tried changing the activation function to ReLU instead of sigmoid or tanh, but it doesn't seem to work well. Below is the code I changed:

def hard_relu(x):
    # Clip pre-activations to [0, 1], a ReLU-style analogue of hard_sigmoid.
    # T is theano.tensor, as elsewhere in the repo.
    return T.clip(x, 0, 1)

def binary_relu_unit(x):
    # round3 is BinaryNet's rounding op (straight-through gradient).
    return round3(hard_relu(x))
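One possible reason this behaves differently: binary_tanh_unit produces activations in {-1, +1}, while the hard_relu variant above produces {0, 1}, which changes the activation statistics the rest of the network (and its initialization) was tuned for. A NumPy sketch of the two deterministic mappings — the names mirror the Theano code, but this is an illustration, not the repo's implementation:

```python
import numpy as np

def hard_tanh(x):
    # Piecewise-linear tanh approximation on [-1, 1].
    return np.clip(x, -1.0, 1.0)

def hard_relu(x):
    # The ReLU-style variant proposed above: clip to [0, 1].
    return np.clip(x, 0.0, 1.0)

def binary_tanh_unit(x):
    # Deterministic rounding of hard_tanh: output in {-1, +1}.
    return 2.0 * np.round(0.5 * (hard_tanh(x) + 1.0)) - 1.0

def binary_relu_unit(x):
    # Deterministic rounding of hard_relu: output in {0, 1}.
    return np.round(hard_relu(x))

x = np.array([-2.0, -0.3, 0.3, 2.0])
print(binary_tanh_unit(x))  # [-1. -1.  1.  1.]
print(binary_relu_unit(x))  # [0. 0. 0. 1.]
```

Note in particular that any input below 0.5 maps to 0 with the ReLU variant, so roughly half of the units can go silent, whereas the tanh variant always outputs ±1.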