Closed ducha-aiki closed 8 years ago
Hi, I have noticed that you put ReLU after classifier, which is not a common practice. Is there some reason for it?
layer {
  name: "conv10"
  type: "Convolution"
  bottom: "fire9/concat"
  top: "conv10"
  convolution_param {
    num_output: 1000
    kernel_size: 1
    weight_filler {
      type: "gaussian"
      mean: 0.0
      std: 0.01
    }
  }
}
layer {
  name: "relu_conv10"
  type: "ReLU"
  bottom: "conv10"
  top: "conv10"
}
layer {
  name: "pool10"
  type: "Pooling"
  bottom: "conv10"
  top: "pool10"
  pooling_param {
    pool: AVE
    global_pooling: true
  }
}
Hmm... no reason. Feel free to try inference (or retrain + inference) without this final ReLU.
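For context on why this placement is unusual: applying a ReLU to the classifier's outputs before global average pooling zeroes out all negative class scores, so the pooled logits for weakly-activated classes collapse toward zero instead of staying negative. A minimal NumPy sketch (with made-up activations standing in for conv10's feature map) illustrates the effect:

```python
import numpy as np

# Hypothetical 1x1-conv outputs for 3 classes over a 2x2 spatial map
# (a stand-in for conv10's 1000-channel feature map).
conv10 = np.array([
    [[ 2.0, -1.0], [ 0.5, -3.0]],   # class 0: mixed positive/negative
    [[-0.5, -0.2], [-1.0, -0.1]],   # class 1: all negative
    [[ 1.0,  1.0], [ 1.0,  1.0]],   # class 2: all positive
])

# Global average pooling directly on the raw conv outputs.
logits_no_relu = conv10.mean(axis=(1, 2))

# With the extra ReLU, negative activations are zeroed first, so a class
# whose scores are mostly negative can no longer produce a negative logit.
logits_with_relu = np.maximum(conv10, 0).mean(axis=(1, 2))

print(logits_no_relu)    # [-0.375 -0.45   1.   ]
print(logits_with_relu)  # [0.625 0.    1.   ]
```

Note how class 1's logit is clamped to exactly 0 with the ReLU in place; the softmax can then never rank it below another all-zero class, which is why the ReLU is usually omitted after the final classifier layer.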