THU-MIG / yolov10

YOLOv10: Real-Time End-to-End Object Detection
https://arxiv.org/abs/2405.14458
GNU Affero General Public License v3.0

Export TensorRT Error #261

Closed · lanyeeee closed this issue 3 months ago

lanyeeee commented 3 months ago

I tried to export to TensorRT with this code:

from ultralytics import YOLOv10

model = YOLOv10('yolov10s.pt')
model.export(format="engine")

output:

WARNING ⚠️ TensorRT requires GPU export, automatically assigning device=0
Ultralytics YOLOv8.1.34 πŸš€ Python-3.9.19 torch-2.3.1+cu118 CUDA:0 (NVIDIA GeForce RTX 3070 Ti, 8192MiB)
YOLOv10s summary (fused): 293 layers, 8096880 parameters, 0 gradients, 24.8 GFLOPs

PyTorch: starting from 'yolov10s.pt' with input shape (1, 3, 640, 640) BCHW and output shape(s) (1, 300, 6) (31.4 MB)

ONNX: starting export with onnx 1.16.1 opset 17...
ONNX: export success βœ… 0.9s, saved as 'yolov10s.onnx' (27.9 MB)

TensorRT: starting export with TensorRT 8.6.1...
[06/14/2024-11:40:00] [TRT] [I] [MemUsageChange] Init CUDA: CPU +372, GPU +0, now: CPU 19872, GPU 2647 (MiB)
[06/14/2024-11:40:00] [TRT] [I] [MemUsageSnapshot] Begin constructing builder kernel library: CPU 20078 MiB, GPU 2647 MiB
[06/14/2024-11:40:02] [TRT] [I] [MemUsageSnapshot] End constructing builder kernel library: CPU 20469 MiB, GPU 2769 MiB
[06/14/2024-11:40:02] [TRT] [I] ----------------------------------------------------------------
[06/14/2024-11:40:02] [TRT] [I] Input filename:   yolov10s.onnx
[06/14/2024-11:40:02] [TRT] [I] ONNX IR version:  0.0.8
[06/14/2024-11:40:02] [TRT] [I] Opset version:    17
[06/14/2024-11:40:02] [TRT] [I] Producer name:    pytorch
[06/14/2024-11:40:02] [TRT] [I] Producer version: 2.3.1
[06/14/2024-11:40:02] [TRT] [I] Domain:           
[06/14/2024-11:40:02] [TRT] [I] Model version:    0
[06/14/2024-11:40:02] [TRT] [I] Doc string:       
[06/14/2024-11:40:02] [TRT] [I] ----------------------------------------------------------------
[06/14/2024-11:40:02] [TRT] [W] onnx2trt_utils.cpp:365: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[06/14/2024-11:40:02] [TRT] [I] No importer registered for op: Mod. Attempting to import as plugin.
[06/14/2024-11:40:02] [TRT] [I] Searching for plugin: Mod, plugin_version: 1, plugin_namespace: 
[06/14/2024-11:40:02] [TRT] [E] ModelImporter.cpp:748: While parsing node number 342 [Mod -> "/model.23/Mod_output_0"]:
[06/14/2024-11:40:02] [TRT] [E] ModelImporter.cpp:749: --- Begin node ---
[06/14/2024-11:40:02] [TRT] [E] ModelImporter.cpp:750: input: "/model.23/TopK_1_output_1"
input: "/model.23/Constant_13_output_0"
output: "/model.23/Mod_output_0"
name: "/model.23/Mod"
op_type: "Mod"
attribute {
  name: "fmod"
  i: 0
  type: INT
}

[06/14/2024-11:40:02] [TRT] [E] ModelImporter.cpp:751: --- End node ---
[06/14/2024-11:40:02] [TRT] [E] ModelImporter.cpp:754: ERROR: builtin_op_importers.cpp:4951 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
TensorRT: export failure ❌ 3.9s: failed to load ONNX file: yolov10s.onnx
Traceback (most recent call last):
  File "{project_root}\main.py", line 5, in <module>
    model.export(format="engine")
  File "{project_root}\ultralytics\engine\model.py", line 590, in export
    return Exporter(overrides=args, _callbacks=self.callbacks)(model=self.model)
  File "D:\ProgramData\anaconda3\envs\yolov10\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "{project_root}\ultralytics\engine\exporter.py", line 288, in __call__
    f[1], _ = self.export_engine()
  File "{project_root}\ultralytics\engine\exporter.py", line 138, in outer_func
    raise e
  File "{project_root}\ultralytics\engine\exporter.py", line 133, in outer_func
    f, model = inner_func(*args, **kwargs)
  File "{project_root}\ultralytics\engine\exporter.py", line 688, in export_engine
    raise RuntimeError(f"failed to load ONNX file: {f_onnx}")
RuntimeError: failed to load ONNX file: yolov10s.onnx

Environment

TensorRT Version: 8.6.1
NVIDIA GPU: RTX 3070 Ti
NVIDIA Driver Version: 522.06
CUDA Version: 11.8
CUDNN Version: 8.9.7.29
Operating System: Windows 11
Python Version: 3.9
PyTorch Version: 2.3.1+cu118

In addition to the above environment, I have also tried:

In all of these environments, the above error occurs.
I did find some similar issues (#129, #75), but they did not help.

lanyeeee commented 3 months ago

TL;DR

Update TensorRT, and don't forget to update the environment variables so they point to the new version.
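As a minimal sketch (my own, assuming a Windows setup where TensorRT 8.x ships its core library as nvinfer.dll), a check like this surfaces the mismatch between the installed .whl and the DLLs actually found on PATH:

import ctypes.util

import tensorrt as trt

# Version reported by the Python binding, i.e. the installed .whl.
print("tensorrt Python package:", trt.__version__)

# On Windows, find_library walks the directories on PATH looking for nvinfer.dll.
# If this resolves to an old TensorRT install, that old copy is what actually runs.
print("nvinfer resolved from:", ctypes.util.find_library("nvinfer"))

If the two versions or locations disagree, the export keeps using the old TensorRT no matter what pip reports.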

Details

After an afternoon of troubleshooting, I realized that I had forgotten to update the environment variables for TensorRT; I had only updated TensorRT via the .whl file.

As a result, TensorRT appears to Python to have been updated, when in fact the TensorRT actually in use is still the old one (the one the environment variable points to).

Once I pointed the environment variable at the 8.6.1 version of TensorRT, it worked like a charm.
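For completeness, here is a rough in-process variant of the same fix, as a sketch only: the install path below is hypothetical, and adjusting PATH inside the process may not be enough on every setup, so editing the system environment variables as described above remains the reliable fix.

import os

TRT_LIB_DIR = r"C:\TensorRT-8.6.1.6\lib"  # hypothetical install location; adjust to yours

# Prepend the new TensorRT lib directory so its DLLs win over any older copy on PATH.
os.environ["PATH"] = TRT_LIB_DIR + os.pathsep + os.environ.get("PATH", "")
# Python 3.8+ on Windows restricts DLL dependency lookup, so register the directory too.
if hasattr(os, "add_dll_directory"):
    os.add_dll_directory(TRT_LIB_DIR)

# Import ultralytics only after the path adjustments so TensorRT loads the new DLLs.
from ultralytics import YOLOv10

model = YOLOv10("yolov10s.pt")
model.export(format="engine")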