jkjung-avt / tensorrt_demos

TensorRT MODNet, YOLOv4, YOLOv3, SSD, MTCNN, and GoogLeNet
https://jkjung-avt.github.io/
MIT License

could not find any supported formats consistent with input/output data types) #521

Closed. Im-JimmyHu closed this issue 2 years ago.

Im-JimmyHu commented 2 years ago

Hi jk. Demo #5: YOLOv4 ran well on JetPack 4.5.1 with TensorRT 7.1.3, but after I upgraded to JetPack 4.6 and TensorRT 8.0, the error above happened. It occurred when I ran the command:

python3 onnx_to_tensorrt.py -m yolov4-416

The complete log looks like this:

    Loading the ONNX file...
    Adding yolo_layer plugins.
    Adding a concatenated output as "detections".
    Naming the input tensort as "input".
    Building the TensorRT engine. This would take a while...
    (Use "--verbose" or "-v" to enable verbose logging.)
    [TensorRT] WARNING: Detected invalid timing cache, setup a local cache instead
    [TensorRT] ERROR: 9: [pluginV2Builder.cpp::reportPluginError::21] Error Code 9: Internal Error ((Unnamed Layer* 506) [PluginV2IOExt]: could not find any supported formats consistent with input/output data types)
    ERROR: failed to build the TensorRT engine!
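
For context, this error appears to be reported while the builder asks the "yolo_layer" plugin (an IPluginV2IOExt) which format/data-type combinations it supports. Below is a rough, stand-alone sketch of building a TensorRT 8.0 engine from an ONNX file with verbose logging, which prints the per-layer format selection; it is not the repo's onnx_to_tensorrt.py (that script also injects the yolo_layer plugin nodes into the graph before parsing), and the paths, filenames and workspace size are assumptions:

    # A minimal sketch, assuming TensorRT 8.0 and the paths below;
    # not the repo's onnx_to_tensorrt.py.
    import ctypes

    import tensorrt as trt

    ctypes.CDLL('./plugins/libyolo_layer.so')      # assumed plugin location

    logger = trt.Logger(trt.Logger.VERBOSE)        # what "--verbose" turns on
    trt.init_libnvinfer_plugins(logger, '')

    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)

    with open('yolov4-416.onnx', 'rb') as f:       # assumed ONNX filename
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise SystemExit('ERROR: failed to parse the ONNX file')

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30            # 1 GB; adjust as needed
    engine = builder.build_engine(network, config) # returns None on failure
    if engine is None:
        raise SystemExit('ERROR: failed to build the TensorRT engine!')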

However, the earlier yolo_to_onnx.py step runs well; the end of its log looks like this:

    %159_convolutional_lrelu = LeakyRelu[alpha = 0.100000001490116](%159_convolutional_bn)
    %160_convolutional = Conv[auto_pad = 'SAME_LOWER', dilations = [1, 1], kernel_shape = [3, 3], strides = [1, 1]](%159_convolutional_lrelu, %160_convolutional_conv_weights)
    %160_convolutional_bn = BatchNormalization[epsilon = 9.99999974737875e-06, momentum = 0.990000009536743](%160_convolutional, %160_convolutional_bn_scale, %160_convolutional_bn_bias, %160_convolutional_bn_mean, %160_convolutional_bn_var)
    %160_convolutional_lrelu = LeakyRelu[alpha = 0.100000001490116](%160_convolutional_bn)
    %161_convolutional = Conv[auto_pad = 'SAME_LOWER', dilations = [1, 1], kernel_shape = [1, 1], strides = [1, 1]](%160_convolutional_lrelu, %161_convolutional_conv_weights, %161_convolutional_conv_bias)
    return %139_convolutional, %150_convolutional, %161_convolutional
    }
    Checking ONNX model...
    Saving ONNX file...
    Done.
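
As a side note, the "Checking ONNX model..." step can be reproduced on its own to confirm that the exported file is well formed. A minimal sketch with the onnx Python package, assuming the output file is named yolov4-416.onnx:

    import onnx

    model = onnx.load('yolov4-416.onnx')             # assumed output of yolo_to_onnx.py
    onnx.checker.check_model(model)                  # raises if the graph is malformed
    print(onnx.helper.printable_graph(model.graph))  # the same textual dump as above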

To find where the error comes from, I also tried the ONNX file generated under JetPack 4.5.1 (which once turned out good) and executed the command

python3 onnx_to_tensorrt.py -m yolov4-416

but I got the same error, and I don't know how to deal with it. Any advice will be appreciated!

jkjung-avt commented 2 years ago

Could you try to modify the source code and rebuild the "yolo_layer" plugin, to see if it resolves the problem?

https://github.com/jkjung-avt/tensorrt_demos/blob/fb2f075cc44a344062b31fa924bb9c4595ec7de4/plugins/yolo_layer.h#L70

Replace the above line of code with the following:

    bool supportsFormatCombination(int pos, const PluginTensorDesc* inOut, int nbInputs, int nbOutputs) const NOEXCEPT override
    {
        // Only accept linear (row-major) FP32 tensors for all plugin inputs/outputs.
        return inOut[pos].format == PluginFormat::kLINEAR && inOut[pos].type == DataType::kFLOAT;
    }
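
After changing that line, the plugin needs to be recompiled (re-running make in the plugins/ directory should rebuild libyolo_layer.so) before regenerating the engine with onnx_to_tensorrt.py. As a quick sanity check that the rebuilt library is the one TensorRT actually loads, a small sketch like the following (the path is an assumption) dumps the registered plugin creators:

    # A minimal sketch to confirm the rebuilt plugin registers with TensorRT;
    # the library path is an assumption.
    import ctypes

    import tensorrt as trt

    ctypes.CDLL('./plugins/libyolo_layer.so')   # rebuilt plugin library

    logger = trt.Logger(trt.Logger.INFO)
    trt.init_libnvinfer_plugins(logger, '')

    # Every registered plugin creator is listed here; the yolo_layer plugin
    # should appear if the library above was loaded and registered correctly.
    for creator in trt.get_plugin_registry().plugin_creator_list:
        print(creator.name, creator.plugin_version)
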
jkjung-avt commented 2 years ago

Does my previous comment help?

Im-JimmyHu commented 2 years ago

Does my previous comment help?

Yeah, thanks, jk!