Islanna / DynamicReLU

Implementation of Dynamic ReLU in PyTorch

There may be a bug here. #7


L-Icarus commented 3 years ago
```python
def get_relu_coefs(self, x):
    print(x.shape)
    # axis?
    theta = torch.mean(x, dim=-1)
    if self.conv_type == '2d':
        # axis?
        theta = torch.mean(theta, dim=-1)
    theta = self.fc1(theta)
    theta = self.relu(theta)
    theta = self.fc2(theta)
    theta = 2 * self.sigmoid(theta) - 1
    return theta
```

My torch version is 1.1.0, where the documented signature is `torch.mean(input, dim, out=None) → Tensor`, so `dim` is the only accepted keyword.
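
(For context: a minimal, runnable sketch of how a `get_relu_coefs` method like this typically sits inside a DyReLU-style module. The class name, layer sizes, `reduction`, and `k` below are illustrative assumptions following the Dynamic ReLU paper's conventions, not this repo's exact code.)

```python
import torch
import torch.nn as nn

class DyReLUCoefs(nn.Module):
    """Sketch of the coefficient branch of a DyReLU-style module.

    `reduction` and `k` (number of piecewise-linear segments) are
    hypothetical defaults, not necessarily what this repo uses.
    """
    def __init__(self, channels, reduction=4, k=2, conv_type='2d'):
        super().__init__()
        self.conv_type = conv_type
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.relu = nn.ReLU(inplace=True)
        self.fc2 = nn.Linear(channels // reduction, 2 * k * channels)
        self.sigmoid = nn.Sigmoid()

    def get_relu_coefs(self, x):
        # Global-average-pool the spatial axes; `dim` is the correct
        # keyword (`axis` is only accepted as an alias in newer PyTorch).
        theta = torch.mean(x, dim=-1)          # pool over width
        if self.conv_type == '2d':
            theta = torch.mean(theta, dim=-1)  # pool over height
        theta = self.fc1(theta)
        theta = self.relu(theta)
        theta = self.fc2(theta)
        # Squash into [-1, 1]; the residual shift to the final slopes
        # and intercepts happens in the module's forward pass.
        theta = 2 * self.sigmoid(theta) - 1
        return theta
```

With these assumed sizes, `DyReLUCoefs(64).get_relu_coefs(torch.randn(2, 64, 8, 8))` yields a `(2, 256)` tensor: `2 * k = 4` coefficients per channel.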

Islanna commented 3 years ago

Thanks! `axis` instead of `dim` is definitely a bug. But no worries, it doesn't affect the results: newer versions of PyTorch silently treat the wrong keyword `axis` as an alias for the correct `dim`, so the code works as expected. If you want to fix this bug, just make a PR and I'll approve it.
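
(A quick check of the aliasing behaviour described above, assuming a recent PyTorch; on 1.1.0 the `axis` call raises a `TypeError` instead of being remapped.)

```python
import torch

x = torch.randn(2, 3, 4)
# On recent PyTorch the NumPy-style alias `axis` is accepted for `dim`,
# so both calls agree; on torch 1.1.0 the first call raises a TypeError
# because the alias did not exist yet.
assert torch.equal(torch.mean(x, axis=-1), torch.mean(x, dim=-1))
```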