ultralytics / yolov5

YOLOv5 πŸš€ in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0
50.79k stars 16.36k forks

Error export ONNX in FP16 (--half) #6560

Closed bzisl closed 2 years ago

bzisl commented 2 years ago

Search before asking

YOLOv5 Component

Export

Bug

When you export to ONNX with FP16 (--half), the export reports an error and the resulting .onnx file is broken:

python3 export.py --img {imgsize} --device 0 --weights {project}/{name}/weights/best.pt --include onnx --half --simplify

export: data=data/coco128.yaml, weights=['mot/bb_v5n2/weights/best.pt'], imgsz=[2048], batch_size=1, device=0, half=True, inplace=False, train=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=12, verbose=False, workspace=4, nms=False, agnostic_nms=False, topk_per_class=100, topk_all=100, iou_thres=0.45, conf_thres=0.25, include=['onnx']
YOLOv5 πŸš€ v6.0-237-gdc7e093 torch 1.10.1+cu113 CUDA:0 (NVIDIA GeForce RTX 3090, 24268MiB)

Fusing layers... 
Model Summary: 213 layers, 1760518 parameters, 0 gradients, 4.2 GFLOPs

PyTorch: starting from mot/bb_v5n2/weights/best.pt with output shape (1, 258048, 6) (5.8 MB)

ONNX: starting export with onnx 1.10.2...
ONNX: simplifying with onnx-simplifier 0.3.6...
**ONNX: simplifier failure: [ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (T) of Optype (Concat) bound to different types (tensor(float) and tensor(float16) in node (Concat_307).**
ONNX: export success, saved as mot/bb_v5n2/weights/best.onnx (7.7 MB)

Environment

YOLOv5 v6.0, RTX 3090

Minimal Reproducible Example

No response

Additional

No response

Are you willing to submit a PR?

glenn-jocher commented 2 years ago

@bzisl your ONNX model exports correctly; only the simplification step fails. Not all arguments are compatible in all combinations, and it looks like --simplify is simply not compatible with the rest of your arguments.

bzisl commented 2 years ago

Ok, thanks!


bzisl commented 2 years ago

Nope, the problem remains. Without --simplify there is no error message during export, but loading the model into ONNX Runtime fails:

Load model from bb_v5n_gpu_fp16.onnx failed:Type Error: Type parameter (T) of Optype (Concat) bound to different types (tensor(float) and tensor(float16) in node (Concat_307).

glenn-jocher commented 2 years ago

@bzisl πŸ‘‹ hi, thanks for letting us know about this possible problem with YOLOv5 πŸš€. We've created a few short guidelines below to help users provide what we need in order to get started investigating a possible problem.

How to create a Minimal, Reproducible Example

When asking a question, people will be better able to provide help if you provide code that they can easily understand and use to reproduce the problem. This is referred to by community members as creating a minimum reproducible example. Your code that reproduces the problem should be:

For Ultralytics to provide assistance, your code should also be:

If you believe your problem meets all the above criteria, please close this issue and raise a new one using the πŸ› Bug Report template with a minimum reproducible example to help us better understand and diagnose your problem.

Thank you! πŸ˜ƒ

github-actions[bot] commented 2 years ago

πŸ‘‹ Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.

Access additional YOLOv5 πŸš€ resources:

Access additional Ultralytics ⚑ resources:

Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLOv5 πŸš€ and Vision AI ⭐!

CVUsers commented 2 years ago

Has this been solved?

wangat commented 2 years ago

Hello, I encountered the same problem when using YOLOv3 (Windows, PyTorch 1.10.2, with or without onnxsim 0.4.7). When the model is loaded in onnxruntime, an error is reported: failed:Type Error: Type parameter (T) of Optype (Concat) bound to different types (tensor(float) and tensor(float16) in node (Concat_394). However, in Netron the FP16 model data appears to have been converted successfully.

CVUsers commented 2 years ago

Are you Chinese? I have solved it; WeChat: zxx15277368495


yangwenwu92 commented 1 year ago

This might be a problem with exporting the YOLO .pt model to ONNX using --half: the Concat layer (operation) does not allow float type data to be concatenated with float16 type data.
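The rule described above can be illustrated with a toy check (a hypothetical helper, not ONNX Runtime code): ONNX binds Concat's type parameter T to a single element type, so every input must agree, all tensor(float) or all tensor(float16), never a mix.

```python
def bind_type_parameter(input_dtypes):
    """Mimic binding Concat's type parameter T; raise on mixed dtypes."""
    distinct = sorted(set(input_dtypes))
    if len(distinct) > 1:
        raise TypeError("Type parameter (T) of Optype (Concat) bound to "
                        f"different types {distinct}")
    return distinct[0]

bind_type_parameter(["tensor(float16)", "tensor(float16)"])  # OK
# bind_type_parameter(["tensor(float)", "tensor(float16)"])  # raises TypeError
```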

glenn-jocher commented 12 months ago

@yangwenwu92 Thank you for pointing that out. It seems the specific issue is related to the conversion of .pt model to ONNX using the --half flag. It appears that the concat layer operation is not allowing float type data to be concatenated with float16 type data. We are continuously improving YOLOv5, and your inputs definitely help in this process. If you have any further questions or concerns, feel free to ask!