Closed: yaduydk97 closed this issue 4 years ago
@yaduydk97
I am applying ReLU after the convolution. The original paper (page 9) mentions that a non-binary activation (e.g., ReLU) can be inserted after the binary convolution.
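For illustration, here is a minimal sketch of an AlexNet-style binarized block with ReLU inserted after the binary convolution. This is not the exact code in this repo: `BinActive` is a hypothetical stand-in for the activation binarization, and the channel sizes are only illustrative.

```python
import torch
import torch.nn as nn

class BinActive(nn.Module):
    """Hypothetical stand-in: binarize activations to {-1, +1} with sign (forward pass only)."""
    def forward(self, x):
        return torch.sign(x)

# One binarized block: BN -> binarize inputs -> conv -> non-binary ReLU -> pooling.
binarized_block = nn.Sequential(
    nn.BatchNorm2d(96),                            # conventional batch norm
    BinActive(),                                   # binarize the convolution's inputs
    nn.Conv2d(96, 256, kernel_size=5, padding=2),  # weights would also be binarized in the real model
    nn.ReLU(inplace=True),                         # non-binary activation after the binary convolution
    nn.MaxPool2d(kernel_size=3, stride=2),
)

x = torch.randn(1, 96, 27, 27)                     # dummy AlexNet-like feature map
print(binarized_block(x).shape)                    # torch.Size([1, 256, 13, 13])
```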
No, I didn't binarize the input for the first and last layers. Binarizing them hurts accuracy a lot.
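As a sketch (layer shapes are just AlexNet-like examples, not taken from this repo), keeping the first and last layers in full precision looks like this, while only the intermediate layers use binarized weights and inputs:

```python
import torch
import torch.nn as nn

# Full-precision first layer: the raw RGB input is not binarized.
first_conv = nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2)

# Full-precision last layer: the classifier is not binarized either.
classifier = nn.Linear(4096, 1000)

x = torch.randn(1, 3, 224, 224)
print(first_conv(x).shape)          # torch.Size([1, 96, 55, 55])
```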
Yes. It is the same as the conventional batch norm. BN is applied here.
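In other words, it is just the standard `torch.nn.BatchNorm2d`. A minimal example is below; placing it before the binarization step, as in the sketch above, is an assumption of the sketch and not a quote of the repo's code.

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(256)                 # conventional batch norm, nothing XNOR-specific
x = torch.randn(8, 256, 13, 13)
y = bn(x)                                # per-channel normalization of the features
print(y.mean().item(), y.std().item())   # roughly 0 and 1 in training mode
```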
Thank you @jiecaoyu
1. Are you using the ReLU activation function in every layer of AlexNet? If so, what is its significance and where is it used? Is it applied just before the max-pooling step?
2. Are you not binarizing the input for the first layer? I think you are binarizing the input in all the other layers.
3. If so, please explain specifically why you are not binarizing the first layer.
4. Is batch normalization necessary? How is it done, and where in the code are you doing it?