Closed brb-chen closed 4 years ago
When I use ResNet34 or VGG16 as the backbone, the intermediate layer shapes do not look right:
Layer (type) Output Shape Param # Connected to
data (InputLayer) (None, 224, 224, 3) 0
bn_data (BatchNormalization) (None, 224, 224, 3) 9 data[0][0]
zero_padding2d_1 (ZeroPadding2D (None, 224, 230, 9) 0 bn_data[0][0]
conv0 (Conv2D) (None, 64, 112, 2) 702464 zero_padding2d_1[0][0]
bn0 (BatchNormalization) (None, 64, 112, 2) 8 conv0[0][0]
relu0 (Activation) (None, 64, 112, 2) 0 bn0[0][0]
zero_padding2d_2 (ZeroPadding2D (None, 64, 114, 4) 0 relu0[0][0]
pooling0 (MaxPooling2D) (None, 64, 56, 1) 0 zero_padding2d_2[0][0]
stage1_unit1_bn1 (BatchNormaliz (None, 64, 56, 1) 4 pooling0[0][0]
stage1_unit1_relu1 (Activation) (None, 64, 56, 1) 0 stage1_unit1_bn1[0][0]
zero_padding2d_3 (ZeroPadding2D (None, 64, 58, 3) 0 stage1_unit1_relu1[0][0]
stage1_unit1_conv1 (Conv2D) (None, 64, 56, 1) 36864 zero_padding2d_3[0][0]
stage1_unit1_bn2 (BatchNormaliz (None, 64, 56, 1) 4 stage1_unit1_conv1[0][0]
stage1_unit1_relu2 (Activation) (None, 64, 56, 1) 0 stage1_unit1_bn2[0][0]
zero_padding2d_4 (ZeroPadding2D (None, 64, 58, 3) 0 stage1_unit1_relu2[0][0]
stage1_unit1_conv2 (Conv2D) (None, 64, 56, 1) 36864 zero_padding2d_4[0][0]
stage1_unit1_sc (Conv2D) (None, 64, 56, 1) 4096 stage1_unit1_relu1[0][0]
add_1 (Add) (None, 64, 56, 1) 0 stage1_unit1_conv2[0][0] stage1_unit1_sc[0][0]
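A likely explanation for shapes like `(None, 224, 230, 9)` above is that the padding layers are treating a channels-last tensor as channels-first: instead of padding height and width, they pad width and the channel axis (224+6=230, 3+6=9). A minimal NumPy sketch of that mismatch (the `zero_pad2d` helper is hypothetical, written here only to mimic `ZeroPadding2D`'s axis selection):

```python
import numpy as np

def zero_pad2d(x, pad=3, data_format="channels_last"):
    # Pad the two "spatial" axes of a 4-D batch tensor, mimicking
    # Keras ZeroPadding2D. Which axes count as spatial depends on
    # data_format -- that mismatch is exactly the bug in this issue.
    if data_format == "channels_last":   # (N, H, W, C): pad axes 1 and 2
        widths = [(0, 0), (pad, pad), (pad, pad), (0, 0)]
    else:                                # "channels_first": (N, C, H, W): pad axes 2 and 3
        widths = [(0, 0), (0, 0), (pad, pad), (pad, pad)]
    return np.pad(x, widths)

x = np.zeros((1, 224, 224, 3))  # a channels_last image batch

# Correct interpretation: height and width grow, channels untouched.
print(zero_pad2d(x, 3, "channels_last").shape)   # (1, 230, 230, 3)

# Wrong interpretation: width and channel axes grow instead,
# reproducing the (None, 224, 230, 9) shape from the summary above.
print(zero_pad2d(x, 3, "channels_first").shape)  # (1, 224, 230, 9)
```

The same axis confusion propagates through every later layer, which is why the "spatial" dimensions of the summary keep shrinking toward 1.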
My Anaconda configuration: Python 3.6, TensorFlow 1.6.0, Keras 2.2.2
Seems weird?
Problem solved: it was a channel-order issue. The ResNet model build script defaults to "channels last" order, but it was still processing the input data in "channels first" order.
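One way to rule out this mismatch is to make sure the global Keras image data format matches what the model build script assumes. A sketch of the relevant setting in `~/.keras/keras.json` (other keys in the file left out here):

```json
{
    "image_data_format": "channels_last"
}
```

Equivalently, `keras.backend.set_image_data_format("channels_last")` can be called before building the model; either way, verify with `keras.backend.image_data_format()` that it matches the data format of your input tensors.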