Open · mrcslws opened this issue 5 years ago
Agree.
Actually, no activations or weights are equal to 0. I already checked.
I faced the same issue and in my case weights and activations with a value of 0 appear quite often.
Replacing `tensor.sign()` with `torch.where(tensor >= 0, 1., -1.)` does the trick.
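A minimal sketch of that replacement, assuming a float tensor; the helper name `binarize_det` is just for illustration and is not from the repo:

```python
import torch

def binarize_det(tensor):
    # Deterministic binarization: every element becomes +1 or -1.
    # Unlike tensor.sign(), inputs that are exactly 0 map to +1 instead of staying 0.
    # Passing Python scalars to torch.where requires a reasonably recent PyTorch;
    # on older versions an equivalent is:
    #   torch.where(tensor >= 0, torch.ones_like(tensor), -torch.ones_like(tensor))
    return torch.where(tensor >= 0, 1.0, -1.0)
```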
This code uses `tensor.sign()` to binarize the activations and weights: https://github.com/itayhubara/BinaryNet.pytorch/blob/f5c3672dede608f568e073a583cadd7a8a88fa9d/models/binarized_modules.py#L13

The desired behavior is to always return -1 or 1, but `sign()` returns 0 for values that are 0. Batch normalization makes 0 less probable, but it can still happen. The code should probably force every activation to be either -1 or 1.