ultralytics / ultralytics

NEW - YOLOv8 πŸš€ in PyTorch > ONNX > OpenVINO > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

yolov10x to engine int8 error #16415

Open pax7 opened 11 hours ago

pax7 commented 11 hours ago


Question

Hello,

I am having a problem exporting a YOLOv10x detection model to a TensorRT engine with `int8=True`.

I was able to export 10x in the past; I am not sure what is new here:

Ultralytics YOLOv8.2.98 πŸš€ Python-3.11.9 torch-2.4.1+cu121 CUDA:0 (NVIDIA GeForce RTX 3080 Ti, 12288MiB)
YOLOv10x summary (fused): 503 layers, 31,593,710 parameters, 0 gradients, 169.8 GFLOPs

PyTorch: starting from 'best10x.pt' with input shape (8, 3, 640, 640) BCHW and output shape(s) (8, 300, 6) (61.2 MB)

ONNX: starting export with onnx 1.16.0 opset 19...
ONNX: export success βœ… 7.8s, saved as 'best10x.onnx' (112.3 MB)

TensorRT: starting export with TensorRT 10.0.1...
[20:55:56] [TRT] [I] [MemUsageChange] Init CUDA: CPU -8, GPU +0, now: CPU 42875, GPU 2561 (MiB)
[20:56:07] [TRT] [I] [MemUsageChange] Init builder kernel library: CPU +2534, GPU +310, now: CPU 45650, GPU 2871 (MiB)
[20:56:08] [TRT] [I] ----------------------------------------------------------------
[20:56:08] [TRT] [I] Input filename:   best10x.onnx
[20:56:08] [TRT] [I] ONNX IR version:  0.0.9
[20:56:08] [TRT] [I] Opset version:    19
[20:56:08] [TRT] [I] Producer name:    pytorch
[20:56:08] [TRT] [I] Producer version: 2.4.1
[20:56:08] [TRT] [I] Domain:
[20:56:08] [TRT] [I] Model version:    0
[20:56:08] [TRT] [I] Doc string:
[20:56:08] [TRT] [I] ----------------------------------------------------------------
TensorRT: input "images" with shape(-1, 3, -1, -1) DataType.FLOAT
TensorRT: output "output0" with shape(-1, 300, 6) DataType.FLOAT
TensorRT: building INT8 engine as best10x.engine
TensorRT: collecting INT8 calibration images from 'data=data-int8.yaml'
Scanning labels.cache... 575 images, 0 backgrounds
[20:56:08] [TRT] [I] Local timing cache in use. Profiling results in this builder pass will not be stored.
[20:56:08] [TRT] [E] 2: Assertion item.second != nullptr failed. region should have been removed from Graph::regions
[20:56:08] [TRT] [E] 2: [checkSanity.cpp::nvinfer1::builder::`anonymous-namespace'::checkLinks::207] Error Code 2: Internal Error (Assertion item.second != nullptr failed. region should have been removed from Graph::regions)
TensorRT: export failure ❌ 20.0s: 'NoneType' object does not support the context manager protocol
'NoneType' object does not support the context manager protocol

Any idea how to export this?
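For reference, the failing export can be reproduced with a call along these lines (a sketch assuming the standard Ultralytics Python API; `best10x.pt` and `data-int8.yaml` are the files from the log, and `batch=8` matches the `(8, 3, 640, 640)` input shape shown above):

```python
# Export arguments matching the log; the actual export call is commented out
# because it requires a CUDA GPU, TensorRT, and the trained model file.
export_args = dict(
    format="engine",        # TensorRT engine target
    int8=True,              # INT8 quantization, needs calibration images
    batch=8,                # matches the (8, 3, 640, 640) input shape
    data="data-int8.yaml",  # calibration dataset from the log
)

# from ultralytics import YOLO
# YOLO("best10x.pt").export(**export_args)
print(export_args)
```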

Additional

No response

UltralyticsAssistant commented 11 hours ago

πŸ‘‹ Hello @pax7, thank you for your interest in Ultralytics πŸš€!

We recommend checking out our Docs where you can explore many Python and CLI usage examples. These resources might have the answers you need.

If this is a πŸ› Bug Report, could you please provide a minimum reproducible example to assist us in debugging the issue?

For custom training questions ❓, providing additional context, such as dataset examples and detailed training logs, will be helpful. Please ensure you are following our Tips for Best Training Results.

Join the Ultralytics community that suits you bestβ€”engage in real-time chat on Discord 🎧, participate in discussions on Discourse, or explore our Subreddit for more insights.

Upgrade

To ensure we're troubleshooting the latest version, upgrade the ultralytics package along with all requirements in a Python>=3.8 environment with PyTorch>=1.8:

pip install -U ultralytics

Environments

YOLOv8 can be tested in any of these verified environments that come preinstalled with all necessary dependencies, including CUDA, Python, and PyTorch:

Status

Ultralytics CI

This badge indicates if all Ultralytics CI tests are currently passing. CI tests check the correct operation of YOLOv8 Modes and Tasks across multiple platforms every 24 hours and upon every commit.

This is an automated response; an Ultralytics engineer will assist you soon. Thank you for your patience 😊!