soapisnotfat / super-resolution

collection of super-resolution models & algorithms
Apache License 2.0

The EDSR model shouldn't have a BN layer #28

Open jzijin opened 4 years ago

jzijin commented 4 years ago

The EDSR model in this repo is not correct. Printing the network gives:

```
Net(
  (input_conv): Conv2d(1, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (residual_layers): Sequential(
    (0): ResnetBlock(
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (activation): ReLU(inplace=True)
    )
    (1): ResnetBlock(
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (activation): ReLU(inplace=True)
    )
    (2): ResnetBlock(
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (activation): ReLU(inplace=True)
    )
    (3): ResnetBlock(
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (activation): ReLU(inplace=True)
    )
  )
  (mid_conv): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (upscale_layers): Sequential(
    (0): PixelShuffleBlock(
      (conv): Conv2d(64, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (ps): PixelShuffle(upscale_factor=2)
    )
    (1): PixelShuffleBlock(
      (conv): Conv2d(64, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (ps): PixelShuffle(upscale_factor=2)
    )
  )
  (output_conv): Conv2d(64, 1, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
)
```

The EDSR model proposed in the paper does not have BN layers: EDSR explicitly removes batch normalization from the residual blocks, yet every `ResnetBlock` above contains a `BatchNorm2d`.
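For reference, here is a minimal sketch of what an EDSR-style residual block looks like per the paper (Lim et al., "Enhanced Deep Residual Networks for Single Image Super-Resolution"): conv, ReLU, conv, plus the skip connection, with no `BatchNorm2d`. The class and parameter names here are illustrative, not this repo's actual code:

```python
import torch.nn as nn


class EDSRResBlock(nn.Module):
    """EDSR-style residual block: conv -> ReLU -> conv, no BatchNorm."""

    def __init__(self, num_filters=64, res_scale=1.0):
        super().__init__()
        self.conv1 = nn.Conv2d(num_filters, num_filters, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(num_filters, num_filters, kernel_size=3, padding=1)
        self.activation = nn.ReLU(inplace=True)
        # The paper scales the residual branch (e.g. 0.1 in the deep
        # variants) to stabilize training; 1.0 is a plain residual add.
        self.res_scale = res_scale

    def forward(self, x):
        residual = self.conv2(self.activation(self.conv1(x)))
        return x + residual * self.res_scale
```

The paper's stated motivation for dropping BN is that normalizing the features limits the network's range flexibility and that BN adds significant memory overhead during training, so simply deleting the `bn` layer from `ResnetBlock` (as sketched above) should bring the implementation in line with EDSR.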