Viscoo opened this issue 6 years ago
The activations are binarised at the input to the next layer.
As for the final layer: my understanding is that while the individual convolution multiplications are binary (XNOR operations), the summation is not:

y = W·X + B = xnor(w1, x1) + xnor(w2, x2) + ... + B
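To make that concrete, here is a minimal NumPy sketch of a binary dot product. The function name `xnor_dot` and the toy inputs are illustrative, not from the repo; the point is that on values in {-1, +1}, elementwise multiplication is equivalent to XNOR, while the accumulation stays a plain real-valued sum:

```python
import numpy as np

def xnor_dot(w, x, bias=0.0):
    """Binary dot product: the elementwise multiplies act as XNORs on
    {-1, +1}, but the accumulation is an ordinary real-valued sum."""
    w = np.sign(w).astype(int)  # weights binarised to {-1, +1}
    x = np.sign(x).astype(int)  # activations binarised to {-1, +1}
    # On {-1, +1}, multiplication equals XNOR of the sign bits:
    # (+1, +1) -> +1, (-1, -1) -> +1, (+1, -1) -> -1
    products = w * x
    return products.sum() + bias  # real-valued sum, not binary

w = np.array([0.3, -1.2, 0.7])
x = np.array([-0.5, -2.0, 1.1])
print(xnor_dot(w, x))  # -> 1.0
```

In hardware this sum is usually done with a popcount over the XNOR bits, which is where the speed-up of binary networks comes from.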
FYI, there's another really good paper out that discusses this a bit further and also shows some big improvements over the original BinaryNet implemented here. It's a pretty small extension to BinaryNet; hopefully I'll get some time to do an implementation:
How to Train a Compact Binary Neural Network with High Accuracy? https://www.aaai.org/ocs/index.php/AAAI/AAAI17/paper/download/14619/14454
@yaysummeriscoming the results in this paper are impressive. When will you implement it? Thanks.
I have run the code successfully, but when I check the neurons after the binary conv layer, their values are dispersed, with no trend towards -1 or 1.
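That dispersed distribution is expected: the conv output is a sum of ±1 products (an integer), and it only becomes ±1 once the sign binarisation is applied at the next layer's input. A hypothetical NumPy sketch (not the repo's code) illustrating the two distributions:

```python
import numpy as np

# Toy binary layer: 64 output neurons, 128 binarised inputs.
rng = np.random.default_rng(0)
w = np.sign(rng.standard_normal((64, 128)))  # binarised weights {-1, +1}
x = np.sign(rng.standard_normal(128))        # binarised input {-1, +1}

# Pre-activation: sum of +/-1 products -> integers in [-128, 128],
# roughly bell-shaped, NOT clustered at -1/+1.
pre_act = w @ x

# Binarisation applied at the NEXT layer's input yields only -1/+1.
post_act = np.where(pre_act >= 0, 1, -1)

print(pre_act[:5])          # dispersed integer values
print(np.unique(post_act))  # -> [-1  1]
```

So when inspecting activations directly after the binary conv layer, you are looking at `pre_act`, which has no reason to concentrate near ±1.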