prhbrt closed this issue 6 years ago
looking into it...
@faustomilletari I see a similar problem in the right-side blocks, e.g.: https://github.com/faustomilletari/VNet/blob/master/Prototxt/train_noPooling_ResNet_cinque.prototxt#L778
```proto
layer {
  name: "addLayer2_4"
  type: "Eltwise"
  bottom: "conv_in16_chan128_right_3"
  bottom: "concat_in16_concat"
  top: "outBlock2_4"
  eltwise_param {
    operation: SUM
  }
}
layer {
  name: "outBlock2_4_RELU"
  type: "PReLU"
  bottom: "outBlock2_4"
  top: "outBlock2_4"
}
```
According to the paper, outBlock2_4_RELU should come before addLayer2_4.
In the paper, the PReLU of the first block appears to be placed before the element-wise sum (source: https://arxiv.org/abs/1606.04797). In the Caffe implementation, however, it appears to come after it:
https://github.com/faustomilletari/VNet/blob/master/Prototxt/train_noPooling_ResNet_cinque.prototxt#L96
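For comparison, a sketch of how the block above might read with the ordering the paper seems to describe, i.e. with the PReLU applied to the convolution branch before the element-wise sum. The layer name `conv_in16_chan128_right_3_RELU` is illustrative, not taken from the repo:

```proto
# Hypothetical ordering matching the paper: PReLU on the
# convolution branch first, then the residual sum.
layer {
  name: "conv_in16_chan128_right_3_RELU"  # illustrative name
  type: "PReLU"
  bottom: "conv_in16_chan128_right_3"
  top: "conv_in16_chan128_right_3"
}
layer {
  name: "addLayer2_4"
  type: "Eltwise"
  bottom: "conv_in16_chan128_right_3"
  bottom: "concat_in16_concat"
  top: "outBlock2_4"
  eltwise_param {
    operation: SUM
  }
}
```

Whether this ordering changes results in practice would need to be confirmed by the authors or by retraining.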