Hi,

I have an RT-DETR (l) model trained using Ultralytics. Naturally, I used the ultralytics-rtdetr export script to get an ONNX file. As I wanted a dynamic batch size, I exported with dynamic axes enabled.
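For reference, the export call looked roughly like this (a minimal sketch using the Ultralytics Python export API; the checkpoint path is illustrative, and the image size is taken from the 384x640 shapes I pass to trtexec below):

```python
from ultralytics import RTDETR

# Load the trained RT-DETR-L checkpoint (path is illustrative)
model = RTDETR("best.pt")

# Export to ONNX with a dynamic batch dimension;
# imgsz matches the 3x384x640 input shape used with trtexec
model.export(format="onnx", dynamic=True, imgsz=(384, 640))
```

This completes normally. But when I build a TensorRT engine from the resulting ONNX with trtexec, I get this error: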
[08/14/2024-01:28:47] [W] [TRT] onnx2trt_utils.cpp:364: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[08/14/2024-01:28:47] [W] [TRT] onnx2trt_utils.cpp:390: One or more weights outside the range of INT32 was clamped
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:720: While parsing node number 2 [Pad -> "/0/model.0/Pad_output_0"]:
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:721: --- Begin node ---
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:722: input: "/0/model.0/stem1/act/Relu_output_0"
input: "/0/model.0/Reshape_1_output_0"
input: ""
output: "/0/model.0/Pad_output_0"
name: "/0/model.0/Pad"
op_type: "Pad"
attribute {
name: "mode"
s: "constant"
type: STRING
}
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:723: --- End node ---
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:726: ERROR: builtin_op_importers.cpp:2990 In function importPad:
[8] Assertion failed: inputs.at(2).is_weights() && "The input constant_value is required to be an initializer."
[08/14/2024-01:28:47] [E] Failed to parse onnx file
[08/14/2024-01:28:47] [I] Finish parsing network model
[08/14/2024-01:28:47] [E] Parsing model failed
[08/14/2024-01:28:47] [E] Engine creation failed
[08/14/2024-01:28:47] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8001] # /usr/src/tensorrt/bin/trtexec --onnx=best.onnx --minShapes=input:1x3x384x640 --optShapes=input:2x3x384x640 --maxShapes=input:40x3x384x640 --fp16 --saveEngine=engines/best.engine