**Open** · SuperbTUM opened 1 year ago
Hello, thanks for your hard work. I have a quick question from inspecting the backbone: why was the ReLU activation line in the ResNet stem deleted?

```python
self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
self.bn1 = nn.BatchNorm2d(64)
# self.relu = nn.ReLU(inplace=True)  # add missed relu
self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
```
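For reference, here is a minimal sketch of the standard torchvision-style ResNet stem with the ReLU in place (`ResNetStem` is a hypothetical name for illustration; it is not from this repository's code):

```python
import torch
import torch.nn as nn

class ResNetStem(nn.Module):
    """Standard ResNet stem: conv -> bn -> relu -> maxpool."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)  # the activation in question
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)  # commenting this out makes the stem a purely linear conv+bn before pooling
        return self.maxpool(x)

x = torch.randn(1, 3, 224, 224)
out = ResNetStem()(x)
print(out.shape)  # torch.Size([1, 64, 56, 56])
```

Without the ReLU, conv1 and bn1 compose into a single affine map, so the stem loses its nonlinearity before max-pooling, which is why the omission looks like a bug rather than an intentional change.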