LeeJunHyun / Image_Segmentation

Pytorch implementation of U-Net, R2U-Net, Attention U-Net, and Attention R2U-Net.

conv size of up-conv2x2 #35

Closed JohnTargaryen closed 5 years ago

JohnTargaryen commented 5 years ago

Hi, the paper says that "Every step in the expansive path consists of an upsampling of the feature map followed by a 2x2 convolution ('up-convolution') that halves the number of feature channels"

  1. In your code, however, I notice that you use a Conv2d of size 3x3. Would you explain the reason for that?

    class up_conv(nn.Module):
        def __init__(self, ch_in, ch_out):
            super(up_conv, self).__init__()
            self.up = nn.Sequential(
                nn.Upsample(scale_factor=2),
                nn.Conv2d(ch_in, ch_out, kernel_size=3, stride=1, padding=1, bias=True),
                nn.BatchNorm2d(ch_out),
                nn.ReLU(inplace=True)
            )

        def forward(self, x):
            x = self.up(x)
            return x
  2. Also, suppose I want to stick to the paper and use a 2x2 conv. How would I choose the padding and stride to make the output size the same as the input size?

Thanks for sharing your work.

LeeJunHyun commented 5 years ago

Hi, @JohnTargaryen. Thank you for your interest in my repo. I just wanted to compare a series of U-Net variants, so I used the same convolution filter size across all the models. If you want, feel free to modify the convolution filter size. You can calculate the padding and stride sizes by referring here. Thanks.
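
For reference, here is a minimal sketch of what a 2x2 variant could look like. With `kernel_size=2` and `stride=1`, the output of `Conv2d` is `n + 2p - 1` for an `n x n` input, so no symmetric padding `p` preserves the size exactly; one workaround is to zero-pad a single row and column (right and bottom) before the conv. The class name `up_conv_2x2` and the padding choice are my own, not from the repo:

```python
import torch
import torch.nn as nn

class up_conv_2x2(nn.Module):
    """Hypothetical 2x2 variant of the repo's up_conv block."""
    def __init__(self, ch_in, ch_out):
        super(up_conv_2x2, self).__init__()
        self.up = nn.Sequential(
            nn.Upsample(scale_factor=2),
            # Pad right and bottom by 1 so the 2x2 conv below
            # returns a map of the same spatial size.
            nn.ZeroPad2d((0, 1, 0, 1)),  # (left, right, top, bottom)
            nn.Conv2d(ch_in, ch_out, kernel_size=2, stride=1, padding=0, bias=True),
            nn.BatchNorm2d(ch_out),
            nn.ReLU(inplace=True)
        )

    def forward(self, x):
        return self.up(x)
```

A quick shape check: a `(1, 64, 16, 16)` input upsampled by 2 should come out as `(1, 32, 32, 32)` with `ch_out=32`, the same spatial size the 3x3/padding-1 version produces.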

JohnTargaryen commented 5 years ago


Thx, I'll look into that