Hi,
I want to implement a shifted ReLU or SELU in the resnet_binary code. But when I change the code to use SELU, or even plain ReLU, I get the following error. Could you please give me some hints about what else I might have to change to replace Hardtanh with SELU? Any pointers would be really appreciated.
/Users/Desktop/BNN-Imagenet/models/resnet_binary.py(59)forward()
-> residual = self.downsample(residual)
(Pdb)
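
For context, this is roughly the kind of change I am making. It is only a simplified sketch of a residual block, not the actual resnet_binary.py code, and the layer names (`BasicBlockSketch`, `self.act`, etc.) are placeholders:

```python
import torch.nn as nn

# Simplified residual block sketch showing the activation swap
# (placeholder names, not the real resnet_binary.py implementation).
class BasicBlockSketch(nn.Module):
    def __init__(self, in_planes, planes, stride=1, downsample=None):
        super().__init__()
        self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        # original activation: self.act = nn.Hardtanh(inplace=True)
        self.act = nn.SELU(inplace=True)   # the replacement I am trying
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.downsample = downsample

    def forward(self, x):
        residual = x
        out = self.act(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        if self.downsample is not None:
            # this is the line where pdb stops for me
            residual = self.downsample(residual)
        out += residual
        return self.act(out)
```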