lewes6369 / TensorRT-Yolov3

TensorRT for Yolov3
MIT License

About leaky layer and upsample layer #34

Open faedtodd opened 5 years ago

faedtodd commented 5 years ago

In tensorRTWrapper/code/include/PluginFactory.h, lines 45-50:

    if (isLeakyRelu(layerName))
    {
        assert(nbWeights == 0 && weights == nullptr);
        mPluginLeakyRelu.emplace_back(std::unique_ptr<INvPlugin, void (*)(INvPlugin*)>(
            createPReLUPlugin(NEG_SLOPE), nvPluginDeleter));
        return mPluginLeakyRelu.back().get();
    ...

Can anyone tell me the meaning of this code?

lewes6369 commented 5 years ago

Because the TensorRT parser cannot handle the negative slope directly (the leaky ReLU version), I added it as a plugin. As written in the TensorRT header, the PReLU plugin layer performs leaky ReLU for 4D tensors: given an input value x, the PReLU layer computes the output as x if x > 0 and negative_slope * x if x <= 0.
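For reference, here is a minimal CPU sketch of the element-wise operation that formula describes (an illustration only, not the repo's actual CUDA kernel; negSlope stands in for the NEG_SLOPE constant, which is 0.1 for Darknet-style leaky ReLU):

    #include <cstddef>

    // Element-wise leaky ReLU: identity for positive values,
    // scaled by the negative slope otherwise.
    void leakyRelu(const float* in, float* out, std::size_t n, float negSlope)
    {
        for (std::size_t i = 0; i < n; ++i)
            out[i] = in[i] > 0.0f ? in[i] : negSlope * in[i];
    }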

faedtodd commented 5 years ago

> Because the TensorRT parser cannot handle the negative slope directly (the leaky ReLU version), I added it as a plugin. As written in the TensorRT header, the PReLU plugin layer performs leaky ReLU for 4D tensors: given an input value x, the PReLU layer computes the output as x if x > 0 and negative_slope * x if x <= 0.

So the relu layer is going to be replaced by the PReLU plugin, which performs the same as the leaky layer, and the upsample layer will be replaced by your upsample plugin, since Caffe doesn't have one?

lewes6369 commented 5 years ago

Yes. In the YOLOv3 model, the relu layer is actually a leaky ReLU layer, and the upsample layer is not supported by TensorRT by default, so both are added as plugins. We have to do it this way to get the same results in TensorRT.
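For anyone wondering what the upsample plugin has to compute: YOLOv3's upsample is nearest-neighbor with stride 2. Below is a minimal CPU sketch of that operation, assuming CHW layout and an integer scale factor (illustration only; the repo implements it as a CUDA TensorRT plugin):

    #include <cstddef>

    // Nearest-neighbor upsampling of a single CHW tensor:
    // each input pixel is replicated into a scale x scale block.
    void upsampleNearest(const float* in, float* out,
                         std::size_t c, std::size_t h, std::size_t w,
                         std::size_t scale)
    {
        const std::size_t oh = h * scale;
        const std::size_t ow = w * scale;
        for (std::size_t ch = 0; ch < c; ++ch)
            for (std::size_t y = 0; y < oh; ++y)
                for (std::size_t x = 0; x < ow; ++x)
                    out[(ch * oh + y) * ow + x] =
                        in[(ch * h + y / scale) * w + x / scale];
    }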