jiecaoyu / XNOR-Net-PyTorch

PyTorch Implementation of XNOR-Net

ReLU Operation and Binarization Function #89

Closed yaduydk97 closed 4 years ago

yaduydk97 commented 4 years ago

1. Are you using the ReLU activation function in every layer of AlexNet? If so, what is its significance and where is it applied? Is it used just before the max-pooling step?
2. Are you not binarizing the input for the 1st layer? I think in all other layers you are binarizing the input.
3. If so, please be specific about why you are not binarizing the 1st layer.
4. Is batch normalization necessary? How is it done, and where in the code are you doing it?

jiecaoyu commented 4 years ago

@yaduydk97

  1. I am applying ReLU after the convolution. The original paper (page 9) mentions that "we can insert a non-binary activation (e.g., ReLU) after binary convolution."

  2. No, I didn't binarize the inputs to the first and last layers.

  3. Binarizing the input of the first layer hurts accuracy a lot.

  4. Yes. It is the same as conventional batch norm. BN is applied here; see the sketch below for where it sits relative to binarization.
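A minimal sketch (not the repository's exact code) of the layer ordering these answers describe: batch norm, then binarization of the activations, then the convolution, then a non-binary ReLU. The names `BinarizeActivation` and `BinConvBlock` are illustrative, and the straight-through-estimator backward pass is one common choice, assumed here rather than quoted from the repo.

```python
import torch
import torch.nn as nn


class BinarizeActivation(torch.autograd.Function):
    """Sign binarization with a straight-through estimator in the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.sign()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Straight-through estimator: pass gradients only where |x| <= 1.
        grad_input = grad_output.clone()
        grad_input[x.abs() > 1] = 0
        return grad_input


class BinConvBlock(nn.Module):
    """BatchNorm -> binarize -> conv -> ReLU, matching the ordering in the answers above."""

    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=padding)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.bn(x)                    # conventional batch norm (answer 4)
        x = BinarizeActivation.apply(x)   # binarize the input activations (answer 2)
        x = self.conv(x)                  # convolution (weight binarization handled separately)
        return self.relu(x)               # non-binary activation after the conv (answer 1)


if __name__ == "__main__":
    block = BinConvBlock(96, 256, kernel_size=5, padding=2)
    out = block(torch.randn(1, 96, 27, 27))
    print(out.shape)  # torch.Size([1, 256, 27, 27])
```

Per answer 2, the first and last layers would skip `BinarizeActivation` and operate on full-precision inputs.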

yaduydk97 commented 4 years ago

Thank you @jiecaoyu