Closed xiaoxiongli closed 7 years ago
@sanghoon Dear sanghoon,
in the pvanet's training prototxt:
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param { dropout_ratio: 0.5 }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
in the faster-rcnn's training prototxt:
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param { dropout_ratio: 0.5 }
}
I am confused: why in pvanet is the ReLU layer placed on top of the dropout layer? ^_^
Hi @xiaoxiongli,
It won't make any difference, since dropout sets randomly selected values to 0 regardless of their original values.
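The equivalence can be checked numerically. A minimal NumPy sketch (not from the thread; it assumes standard inverted dropout with a fixed mask): zeroing entries and scaling by a positive constant both commute with ReLU, so applying them in either order gives the same result.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10)
p = 0.5
# Inverted dropout: zero out with probability p, scale survivors by 1/(1-p)
mask = (rng.random(10) >= p) / (1 - p)

def relu(v):
    return np.maximum(v, 0)

out_drop_then_relu = relu(x * mask)   # pvanet order: Dropout -> ReLU
out_relu_then_drop = relu(x) * mask   # faster-rcnn order: ReLU -> Dropout

# With the same mask, both orders produce identical activations
assert np.allclose(out_drop_then_relu, out_relu_then_drop)
```

The key point is that the dropout mask is non-negative, so multiplying by it never changes the sign of an activation, and ReLU only depends on sign.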
@sanghoon thank you very much for your reply, it is very clear.^_^