def forward(self, input):
    # 3x3 branch: a single 3x3 conv applied directly to the input
    conv3X3 = self.conv3X3(input)
    # 5x5 branch: two stacked 3x3 convs on the input (5x5 receptive field)
    conv5X5_1 = self.conv5X5_1(input)
    conv5X5 = self.conv5X5_2(conv5X5_1)
    # 7x7 branch: reuses conv5X5_1 and stacks two more 3x3 convs (7x7 receptive field)
    conv7X7_2 = self.conv7X7_2(conv5X5_1)
    conv7X7 = self.conv7x7_3(conv7X7_2)
    # concatenate the three branches along the channel dimension
    out = torch.cat([conv3X3, conv5X5, conv7X7], dim=1)
    out = F.relu(out)
    return out
I think this code is different from the paper. It seems that
    conv5X5_1 = self.conv5X5_1(input)
should be changed to
    conv5X5_1 = self.conv5X5_1(conv3X3)
Could you please tell me why the code is written this way?
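To make the suggestion concrete, here is a minimal sketch of the forward pass I have in mind. It is not runnable against the current __init__ as-is: conv5X5_1 would also need its input channels redefined to match conv3X3's output channels.

def forward(self, input):
    conv3X3 = self.conv3X3(input)
    # proposed change: feed the 3x3 branch output into the 5x5 branch
    # (assumes conv5X5_1's in_channels are changed to match conv3X3's output)
    conv5X5_1 = self.conv5X5_1(conv3X3)
    conv5X5 = self.conv5X5_2(conv5X5_1)
    conv7X7_2 = self.conv7X7_2(conv5X5_1)
    conv7X7 = self.conv7x7_3(conv7X7_2)
    out = torch.cat([conv3X3, conv5X5, conv7X7], dim=1)
    out = F.relu(out)
    return out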