lyuwenyu / RT-DETR

[CVPR 2024] Official RT-DETR (RTDETR paddle pytorch), Real-Time DEtection TRansformer, DETRs Beat YOLOs on Real-time Object Detection. 🔥 🔥 🔥
Apache License 2.0

Error when running onnx-simplifier on the exported ONNX model #180

Closed IronmanVsThanos closed 10 months ago

IronmanVsThanos commented 10 months ago

$ python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --simplify -s 640

Starting: output/rtdetr_r18vd_6x_coco/checkpoint0013.pth
Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Simplifying the ONNX model
Traceback (most recent call last):
  File "export_rtdetr_pytorch.py", line 110, in <module>
    sys.exit(main(args))
  File "export_rtdetr_pytorch.py", line 83, in main
    model_onnx, _ = onnxsim.simplify(model_onnx)
  File "/home/inviol/.virtualenvs/RTDETR_pytorch/lib/python3.8/site-packages/onnxsim/onnx_simplifier.py", line 199, in simplify
    model_opt_bytes = C.simplify(
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] (op_type:Mul, node name: /1/Mul): [ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0)

How should I fix this?

lyuwenyu commented 10 months ago

The onnx-simplify step here is not required and can be turned off; the error is most likely related to your package versions.
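For reference, a minimal sketch of that fallback, assuming you load the exported graph yourself (the file names below are illustrative, not taken from the repo's export script): try onnx-simplifier, and keep the unsimplified graph if simplification fails.

import onnx
import onnxsim

# Illustrative paths; adjust to your own export script.
model_onnx = onnx.load("rtdetr_r18vd.onnx")

try:
    # onnx-simplifier is optional; its shape inference can fail on some
    # onnx/onnxsim version combinations (as in the traceback above).
    simplified, ok = onnxsim.simplify(model_onnx)
    if ok:
        model_onnx = simplified
except Exception as exc:
    # Fall back to the unsimplified graph, as suggested above.
    print(f"onnxsim failed ({exc}); keeping the unsimplified model")

onnx.save(model_onnx, "rtdetr_r18vd_simplified.onnx")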

IronmanVsThanos commented 10 months ago

After turning it off, the ONNX model can be generated, but converting it to an .engine with DeepStream 6.0 reports the following error:

WARNING: [TRT]: onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
ERROR: [TRT]: ModelImporter.cpp:773: While parsing node number 444 [GridSample -> "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"]:
ERROR: [TRT]: ModelImporter.cpp:774: --- Begin node ---
ERROR: [TRT]: ModelImporter.cpp:775: input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_5_output_0" input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_6_output_0" output: "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0" name: "/0/decoder/decoder/layers.0/cross_attn/GridSample" op_type: "GridSample" attribute { name: "align_corners" i: 0 type: INT } attribute { name: "mode" s: "bilinear" type: STRING } attribute { name: "padding_mode" s: "zeros" type: STRING }
ERROR: [TRT]: ModelImporter.cpp:776: --- End node ---
ERROR: [TRT]: ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4870 In function importFallbackPluginImporter: [8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
ERROR: Failed to parse onnx file
ERROR: failed to build network since parsing model errors.

IronmanVsThanos commented 10 months ago

When you say it is related to the version, do you mean the onnx version or something else? Thank you for your reply.

lyuwenyu commented 10 months ago

op_type: "GridSample" attribute { name: "align_corners" i: 0 type: INT } attribute { name: "mode" s: "bilinear" type: STRING } attribute { name: "padding_mode" s: "zeros" type: STRING }

[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"

These operators are not supported (by your TensorRT version).
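If it helps, a small sketch for checking which operators the exported graph actually contains (the model path is illustrative); GridSample is the one the TensorRT ONNX parser rejects here:

import onnx
from collections import Counter

# Illustrative path; point this at the exported model.
model = onnx.load("rtdetr_r18vd.onnx")

# Count operator types in the graph and flag GridSample, which the
# TensorRT ONNX parser only imports natively from 8.5.x onward.
ops = Counter(node.op_type for node in model.graph.node)
print(ops.most_common())
if "GridSample" in ops:
    print("GridSample found: needs a newer TensorRT or a custom plugin")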

IronmanVsThanos commented 10 months ago

So does that mean I have to upgrade my TensorRT to a version that supports these operators?

lyuwenyu commented 10 months ago

You need TensorRT version >= 8.5.1. (RT-DETRv2 will be released soon; the upgraded models are much less dependent on the TensorRT version, so stay tuned.)
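A quick sanity check before rebuilding the engine, assuming the Python TensorRT bindings match the DeepStream runtime (a sketch; a simple tuple comparison is enough here):

import tensorrt as trt

# GridSample is imported natively starting with TensorRT 8.5.1.
required = (8, 5, 1)
current = tuple(int(p) for p in trt.__version__.split(".")[:3])
print("TensorRT", trt.__version__)
if current < required:
    print("Upgrade to a DeepStream release that ships TensorRT >= 8.5.1 before building the engine")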

IronmanVsThanos commented 10 months ago

Wow, looking forward to it. Thank you for your excellent work.