WZH0120 / SAM2-UNet

SAM2-UNet: Segment Anything 2 Makes Strong Encoder for Natural and Medical Image Segmentation
Apache License 2.0

No activation in BasicConv #6

Closed Callidior closed 3 months ago

Callidior commented 3 months ago

Hi, thanks for this repository! So far, SAM2UNet performs really well in my experiments.

I just came across an implementation detail that confused me a bit: the BasicConv2d block, sequences of which are used in the RFB module, has a ReLU child module that is never used:

class BasicConv2d(nn.Module):
    def __init__(self, in_planes, out_planes, kernel_size, stride=1, padding=0, dilation=1):
        super(BasicConv2d, self).__init__()
        self.conv = nn.Conv2d(in_planes, out_planes,
                              kernel_size=kernel_size, stride=stride,
                              padding=padding, dilation=dilation, bias=False)
        self.bn = nn.BatchNorm2d(out_planes)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.conv(x)
        x = self.bn(x)
        # SHOULD THERE BE `x = self.relu(x)` HERE ?
        return x

Was the activation omitted intentionally and, if so, what is the reason?

xiongxyowo commented 3 months ago

Hi, we exactly follow the RFB_modified design from PraNet and other popular salient/camouflaged object detection networks. In the original RFB from RFBNet, ReLU is only partially disabled. Unfortunately, the PraNet paper does not explain the motivation for this design change regarding ReLU.
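
For anyone who wants to experiment with the activation anyway, a minimal sketch of a variant with an optional `act` flag (the flag is my own addition, not part of SAM2-UNet or PraNet; the default preserves the repository's conv → BN behaviour):

```python
import torch
import torch.nn as nn

class BasicConv2d(nn.Module):
    """Conv -> BN block; ReLU is optional and OFF by default,
    matching the RFB_modified convention used in the repo."""

    def __init__(self, in_planes, out_planes, kernel_size, stride=1,
                 padding=0, dilation=1, act=False):
        super().__init__()
        self.conv = nn.Conv2d(in_planes, out_planes, kernel_size=kernel_size,
                              stride=stride, padding=padding,
                              dilation=dilation, bias=False)
        self.bn = nn.BatchNorm2d(out_planes)
        # nn.Identity() keeps the module structure uniform while
        # applying no activation when act=False.
        self.relu = nn.ReLU(inplace=True) if act else nn.Identity()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))
```

Using `nn.Identity()` rather than a conditional in `forward` keeps the forward pass branch-free and makes the choice visible when printing the module.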

Callidior commented 3 months ago

Thanks for the quick answer! Glad to hear this is intentional :)