cocorecoco closed this issue 4 years ago
Hi @cocorecoco,
The ONNX parser supports the PRelu op according to here: https://github.com/onnx/onnx-tensorrt/blob/master/operators.md
You could try to convert your model to ONNX and then use the ONNX parser to create a TensorRT network instead.
You can reference the samples for examples of using the ONNX parser.
If it doesn't work with the ONNX parser that comes with your TensorRT installation, you may need to update the parser (which is an open source component) by following the steps in this repo's readme: https://github.com/NVIDIA/TensorRT/blob/master/README.md
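The conversion path described above can be sketched as follows. This is a minimal, hedged example assuming the TensorRT 6 C++ API and the OSS ONNX parser; `model.onnx` is a placeholder path. It requires the TensorRT headers and libraries to build.

```cpp
// Sketch: parse an exported ONNX model into a TensorRT network.
// Assumes TensorRT 6 with the ONNX parser (NvOnnxParser.h).
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <cstdint>
#include <iostream>

// Minimal logger required by the builder and parser.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

int main() {
    auto* builder = nvinfer1::createInferBuilder(gLogger);
    // ONNX models require an explicit-batch network.
    const uint32_t explicitBatch = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto* network = builder->createNetworkV2(explicitBatch);
    auto* parser = nvonnxparser::createParser(*network, gLogger);

    // "model.onnx" is a placeholder; PRelu nodes are handled by the parser.
    if (!parser->parseFromFile("model.onnx",
            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING))) {
        std::cerr << "Failed to parse ONNX model" << std::endl;
        return 1;
    }
    // ... build the engine from `network` as in the TensorRT samples ...
    return 0;
}
```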
As far as the Caffe parser goes, this may be related to https://github.com/NVIDIA/TensorRT/issues/179. But the ONNX parser may still work.
@rmccorm4 How do I call PReLU through the TensorRT network API? This code doesn't seem to work; it throws an undefined-reference error:
plugin = plugin::createPReLUPlugin(serialData, serialLength);
error:
undefined reference to `nvinfer1::plugin::createPReLUPlugin(float)'
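An undefined reference to an `nvinfer1::plugin::` symbol usually means `libnvinfer_plugin` is missing from the link line (`-lnvinfer_plugin`). But on TensorRT 6 the plugin shouldn't be needed at all: PReLU is a native layer (`IParametricReLULayer`). A hedged sketch, with illustrative placeholder names, assuming the TensorRT 6 C++ API:

```cpp
// Sketch: build PReLU with the native network API instead of the plugin.
// `addPRelu`, `channels`, etc. are placeholders, not TensorRT names.
#include <NvInfer.h>

nvinfer1::ITensor* addPRelu(nvinfer1::INetworkDefinition& network,
                            nvinfer1::ITensor& input,
                            nvinfer1::Weights slopeWeights,
                            int channels) {
    // The slope tensor must be broadcastable to the input shape,
    // e.g. {1, C, 1, 1} for per-channel slopes on an NCHW input.
    nvinfer1::Dims slopeDims;
    slopeDims.nbDims = 4;
    slopeDims.d[0] = 1;
    slopeDims.d[1] = channels;
    slopeDims.d[2] = 1;
    slopeDims.d[3] = 1;
    auto* slope = network.addConstant(slopeDims, slopeWeights);
    auto* layer = network.addParametricReLU(input, *slope->getOutput(0));
    return layer ? layer->getOutput(0) : nullptr;
}
```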
Got the same problem. Did you solve it?
TensorRT 6.0.1.5, CUDA 10.1, cuDNN 7.6.3. When I try to convert a Caffe model with a PReLU layer, I get the following error:
ERROR: Parameter check failed at: ../builder/Network.cpp::addConstant::562, condition: allDimsGtEq(dimensions, 1)
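The failing check, `allDimsGtEq(dimensions, 1)`, means `INetworkDefinition::addConstant` rejects any constant whose shape has a dimension smaller than 1; the parser appears to be emitting the PReLU slope constant with an empty or zero dimension, and updating the OSS parser as described above may fix it. A hedged illustration of the precondition (placeholder names, assuming the TensorRT 6 C++ API):

```cpp
// Sketch of the precondition behind the error above:
// addConstant requires every dimension >= 1, so a PReLU slope
// constant needs an explicit all-positive shape.
#include <NvInfer.h>

nvinfer1::IConstantLayer* addSlopeConstant(
        nvinfer1::INetworkDefinition& network,
        nvinfer1::Weights slopes, int channels) {
    nvinfer1::Dims dims;
    dims.nbDims = 4;
    // All four dimensions satisfy allDimsGtEq(dimensions, 1).
    dims.d[0] = 1; dims.d[1] = channels; dims.d[2] = 1; dims.d[3] = 1;
    return network.addConstant(dims, slopes);  // nullptr if the check fails
}
```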