psyhtest closed this issue 7 years ago
As noticed by @milakov, it's quite surprising to see the following 1×1 convolutional layer with a padding of 1:
layer {
  name: "conv10"
  type: "Convolution"
  bottom: "fire9/concat"
  top: "conv10"
  convolution_param {
    num_output: 1000
    pad: 1
    kernel_size: 1
  }
}
Any comments?
Duplicate issue. See #14.
@forresti Thanks! (I really should have guessed this, as a compiler person used to generating correct, if redundant, code. :))
Haha yup, that's exactly what happened. :)
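For future readers, here is a minimal sketch of the "correct if redundant" point, using PyTorch instead of Caffe purely for illustration (the 512-channel, 13×13 shape of fire9/concat is assumed from SqueezeNet v1.0). With a 1×1 kernel, pad: 1 only adds a zero border: the output grows from 13×13 to 15×15, the border pixels are just the bias, and the interior is identical to the unpadded layer's output, so the padding is redundant rather than wrong.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for fire9/concat activations (channel count and spatial
# size are assumptions, not taken from the thread).
x = torch.randn(1, 512, 13, 13)

conv_padded = nn.Conv2d(512, 1000, kernel_size=1, padding=1)
conv_plain = nn.Conv2d(512, 1000, kernel_size=1, padding=0)
conv_plain.load_state_dict(conv_padded.state_dict())  # identical weights and bias

with torch.no_grad():
    y_padded = conv_padded(x)  # shape: (1, 1000, 15, 15)
    y_plain = conv_plain(x)    # shape: (1, 1000, 13, 13)

# The interior of the padded output matches the unpadded output exactly.
assert torch.allclose(y_padded[:, :, 1:-1, 1:-1], y_plain)

# A border pixel sees only zero padding, so it reduces to the bias.
assert torch.allclose(y_padded[:, :, 0, 0], conv_padded.bias)

Since conv10 is followed by global average pooling in SqueezeNet, the extra bias-valued border only slightly perturbs the averaged logits, which is presumably why the network trains fine despite the leftover padding.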