sanghoon / pva-faster-rcnn

Demo code for PVANet
https://arxiv.org/abs/1611.08588

about the order of the ReLU Layer and Dropout Layer in the prototxt? #23

Closed xiaoxiongli closed 7 years ago

xiaoxiongli commented 7 years ago

@sanghoon Dear sanghoon,

in the pvanet's training prototxt:

layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param { dropout_ratio: 0.5 }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}

in the faster-rcnn's training prototxt:

layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param { dropout_ratio: 0.5 }
}
</layer>

I'm confused: why does PVANet apply Dropout before ReLU, while Faster R-CNN applies ReLU before Dropout? ^_^

sanghoon commented 7 years ago

Hi @xiaoxiongli,

It won't make any difference: dropout sets randomly selected values to 0 regardless of their original values (and scales the surviving ones by a positive factor), so swapping it with ReLU produces the same output.
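A quick way to see this: for a fixed dropout mask, applying ReLU before or after dropout gives identical results, because each mask entry is either 0 or a positive scale factor, and ReLU commutes with multiplication by a nonnegative number. A minimal NumPy sketch (seed, shape, and ratio are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
p = 0.5  # dropout_ratio

# Fixed dropout mask with inverted-dropout scaling:
# each entry is either 0 or 1/(1-p), i.e. always nonnegative.
mask = (rng.random(8) >= p) / (1.0 - p)

relu = lambda v: np.maximum(v, 0.0)

a = relu(x * mask)   # Dropout -> ReLU (PVANet ordering)
b = relu(x) * mask   # ReLU -> Dropout (Faster R-CNN ordering)

print(np.allclose(a, b))  # True
```

Since max(m*x, 0) == m*max(x, 0) for any m >= 0, the two orderings coincide element-wise; this would not hold for an activation that can output nonzero values for negative inputs (e.g. a bias-shifted one).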

xiaoxiongli commented 7 years ago

@sanghoon thank you very much for your reply, it is very clear.^_^