ChenDRAG opened 5 years ago
```python
nn.Hardtanh(inplace=True),
BinarizeConv2d(int(192*self.ratioInfl), int(384*self.ratioInfl), kernel_size=3, padding=1),
```
This is sample code from alexnet_binary.py. What I don't understand is: since you already binarize the input inside the BinarizeConv2d function, what is the point of using the Hardtanh activation?
It approximates the gradient of the sign function in the backward pass. The sign function has zero gradient almost everywhere, so binary networks train with a straight-through estimator (STE): the gradient is passed through unchanged inside [-1, 1] and zeroed outside. Hardtanh's derivative is exactly that mask (1 on [-1, 1], 0 elsewhere), so placing it before binarization gives the clipped gradients the STE requires, even though its forward output is immediately re-binarized.
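A minimal sketch of that idea in PyTorch (illustrative only, not the repo's actual `BinarizeConv2d` implementation): the forward pass takes the sign, while the backward pass clips gradients exactly as Hardtanh's derivative would.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Sign in the forward pass; straight-through estimator in the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.sign()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Derivative of hardtanh: 1 inside [-1, 1], 0 outside.
        return grad_output * (x.abs() <= 1).float()

x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)
y = BinarizeSTE.apply(x)   # forward: [-1, -1, 1, 1]
y.sum().backward()
print(x.grad)              # gradients zeroed where |x| > 1: [0, 1, 1, 0]
```

Without the clipping mask the estimator would let gradients flow for arbitrarily large pre-activations, which destabilizes training; the Hardtanh-shaped mask keeps updates only where the sign function is "active".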