microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

Integrate with ONNX 1.16.0 release branch #19783

Open cjvolzka opened 4 months ago

cjvolzka commented 4 months ago

Describe the feature request

We are releasing ONNX 1.16.0. A release branch has been created (https://github.com/onnx/onnx/tree/rel-1.16.0), and the planned release date is March 25, 2024. Release candidates are also available from TestPyPI: `pip install -i https://test.pypi.org/simple/ --pre onnx`
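For reference, a minimal sanity check after installing the release candidate might look like the sketch below; the exact pre-release version string, and the IR/opset numbers it reports, depend on which RC is actually installed.

```python
# Minimal sanity check after: pip install -i https://test.pypi.org/simple/ --pre onnx
# The printed values depend on the release candidate actually installed.
import onnx
import onnx.defs

print(onnx.__version__)                # pre-release version string, e.g. a 1.16.0 RC
print(onnx.IR_VERSION)                 # IR version shipped with this release
print(onnx.defs.onnx_opset_version())  # highest default-domain opset it defines
```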

It is important to integrate the ONNX release branch into ORT as soon as possible so that any issues and incompatibilities can be detected and resolved before the ONNX release.

Please follow the instructions at https://github.com/microsoft/onnxruntime/blob/master/docs/How_To_Update_ONNX_Dev_Notes.md to integrate with the ONNX release branch, and please implement CPU kernels for new and updated ONNX ops. A list of new and updated ops can be found at https://github.com/onnx/onnx/wiki/Logistics-for-ONNX-Release-1.16.0.

Key updates:

If a bug in ONNX is detected while integrating ONNX 1.16.0, please open an ONNX bug report and tag the ONNX release manager @cjvolzka so that the bug can be fixed in the ONNX release branch.

Describe scenario use case

ORT is integrated with ONNX rel-1.16.0 with all CI pipelines passing.
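As a rough illustration of what a successful integration enables (a sketch, not part of the actual CI pipelines), a smoke test along these lines would exercise ORT against a model targeting the new default-domain opset; the opset number 21 used here is an assumption about what ONNX 1.16.0 introduces.

```python
# Hypothetical smoke test: build a tiny model against the new default-domain
# opset (assumed to be 21 for ONNX 1.16.0) and run it through ORT's CPU EP.
import numpy as np
import onnx
from onnx import TensorProto, helper
import onnxruntime as ort

node = helper.make_node("Identity", ["x"], ["y"])
graph = helper.make_graph(
    [node],
    "rel_1_16_smoke",
    [helper.make_tensor_value_info("x", TensorProto.FLOAT, [2])],
    [helper.make_tensor_value_info("y", TensorProto.FLOAT, [2])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 21)])
onnx.checker.check_model(model)

sess = ort.InferenceSession(model.SerializeToString(),
                            providers=["CPUExecutionProvider"])
print(sess.run(None, {"x": np.array([1.0, 2.0], dtype=np.float32)}))
```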

cjvolzka commented 4 months ago

Related PR: https://github.com/microsoft/onnxruntime/pull/19745

jkosek commented 3 months ago

Is there any timeline for support of onnx 1.16.0 and IR 10 in onnxruntime?

EwoutH commented 2 months ago

Since #19745 is merged, is this issue fully completed?

cjvolzka commented 2 months ago

> Since https://github.com/microsoft/onnxruntime/pull/19745 is merged, is this issue fully completed?

It appears so. @liqunfu, was everything covered, or is there anything outstanding that should keep this issue open?