ShaoqingRen / SPP_net

SPP_net : Spatial Pyramid Pooling in Deep Convolutional Networks for Visual Recognition

Dropout ratio change for SPP caffe version not accounted for in deployment #51

Closed legolas123 closed 9 years ago

legolas123 commented 9 years ago

Since in the SPP caffe version the scaling in the dropout layer does not happen during training and has been shifted to the test phase, shouldn't spp_poolX_to_fcX.m scale the features appropriately after the max(0, feat) operation?
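
A minimal sketch of the scaling being suggested (hypothetical, not the actual spp_poolX_to_fcX.m code; the dropout_ratio value and the dummy feat are assumptions for illustration):

```matlab
% Sketch of the proposed test-time compensation, assuming the drop probability
% in the training prototxt is 0.5 and 'feat' holds the fc-layer output.
dropout_ratio = 0.5;                 % assumed drop probability
feat = randn(4096, 1);               % dummy fc-layer output for illustration
feat = max(0, feat);                 % ReLU, as in the original script
feat = feat * (1 - dropout_ratio);   % compensate at test time for the missing
                                     % train-time dropout scaling
```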

ShaoqingRen commented 9 years ago

@legolas123, you are absolutely right. We realized this issue around the time we were doing external testing. We experimentally verified the difference with and without scaling the features; it leads to a change of around 0.5 in mAP. However, considering that all of the models in the Caffe model zoo do the scaling during training, which is compatible with the current SPP code, the current SPP code may be more suitable for Caffe forks (except ours).
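
A rough sanity check of that point (a sketch, not from the repository; p, x, and mask are illustration variables): when scaling is done during training (inverted dropout, as in zoo models), the train-time expectation already matches the test-time forward pass, so no extra scaling is needed in the MATLAB deployment path.

```matlab
% Compare the two dropout schemes on dummy activations.
p = 0.5;                               % drop probability
x = rand(1, 1e6);                      % dummy activations
mask = rand(size(x)) > p;              % Bernoulli keep mask
classic  = mean(x .* mask);            % classic dropout: no train-time scaling
inverted = mean(x .* mask) / (1 - p);  % inverted dropout: scaled during training
% classic  is about (1 - p) * mean(x)  -> needs the *(1 - p) fix at test time
% inverted is about mean(x)            -> already consistent, no fix needed
```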

legolas123 commented 9 years ago

Ok, that seems correct. Thanks