microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

ONNX Runtime and PyTorch results are different #20219

Open W-QY opened 3 months ago

W-QY commented 3 months ago

Describe the issue

I designed and trained a 6D pose estimation model with PyTorch, then used torch.onnx.export to convert the .pth checkpoint into an ONNX file for inference. By comparing the two, I found that for some inputs (for example, when the target in the image is small and the background is pure black) the inference results from ONNX Runtime and PyTorch are clearly inconsistent, leading to a large difference between them (in these cases both results also have very large errors compared with the ground truth).

How can I reduce or completely eliminate the inference differences between ONNX Runtime and PyTorch?
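
For reference, a minimal sketch of how the two backends can be compared numerically on the same input; `build_model()`, the file names, the input shape, and the tolerances below are placeholders standing in for the project-specific model and files, not code from the repository:

```python
import numpy as np
import torch
import onnxruntime as ort

# Placeholders: build_model(), "model.pth", "model.onnx" and the input
# shape stand in for the project-specific model, checkpoint and export.
model = build_model()
model.load_state_dict(torch.load("model.pth", map_location="cpu"))
model.eval()

x = torch.randn(1, 3, 256, 256)

# PyTorch reference output.
with torch.no_grad():
    torch_out = model(x).numpy()

# ONNX Runtime output for the same input.
sess = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
input_name = sess.get_inputs()[0].name
ort_out = sess.run(None, {input_name: x.numpy()})[0]

# Bit-exact equality is not guaranteed across backends; compare within a
# tolerance and report the worst element-wise difference.
print("max abs diff:", np.abs(torch_out - ort_out).max())
np.testing.assert_allclose(torch_out, ort_out, rtol=1e-3, atol=1e-5)
```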

To reproduce

Our algorithm can be found at this link: https://github.com/YangHai-1218/PseudoFlow/blob/69e8e7ad11a2a58f06532cc5b89b76300d83613b/models/estimator/wdr_pose.py

Urgency

No response

Platform

Linux

OS Version

86~20.04.2-Ubuntu

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.15.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

CUDA

Execution Provider Library Version

CUDA 11.3

W-QY commented 3 months ago

What is certain is that the current ONNX inference results are basically consistent with the .pth inference results in most cases, but exceptions do occur, and I want to know how to avoid these exceptions.
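
One general way to narrow down such exceptions (a debugging sketch based on standard ONNX Runtime usage, not tied to this specific model; the path and input shape are placeholders) is to run the exported model under both the CPU and CUDA execution providers and compare:

```python
import numpy as np
import onnxruntime as ort

x = np.random.randn(1, 3, 256, 256).astype(np.float32)  # placeholder input

cpu_sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
cuda_sess = ort.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])

name = cpu_sess.get_inputs()[0].name
cpu_out = cpu_sess.run(None, {name: x})[0]
cuda_out = cuda_sess.run(None, {name: x})[0]

# If the CPU EP result matches PyTorch but the CUDA EP result does not, the
# discrepancy likely comes from CUDA kernel behavior (e.g. fused ops or
# reduced-precision math modes) rather than from the export itself.
print("CPU vs CUDA max abs diff:", np.abs(cpu_out - cuda_out).max())
```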

W-QY commented 3 months ago

I want to know whether some error between ONNX and .pth inference is inevitable, or whether we can expect exactly the same results. Is there some unavoidable quantization step when generating the ONNX model?
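
For context: torch.onnx.export does not quantize by default; weights are exported in the model's original precision (typically float32). A quick sketch to confirm this by inspecting the initializer data types of the exported model ("model.onnx" is a placeholder path):

```python
import onnx
from onnx import TensorProto

m = onnx.load("model.onnx")
# Collect the data types of all weight initializers in the graph.
dtypes = {TensorProto.DataType.Name(init.data_type) for init in m.graph.initializer}
print("initializer dtypes:", dtypes)  # expect {'FLOAT'} if nothing was quantized
```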

github-actions[bot] commented 2 months ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.