microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

When will v1.20.0 be released for onnxruntime-openvino? #22783

Open cheripai opened 1 week ago

cheripai commented 1 week ago

Describe the issue

Hello, I am interested in using v1.20.0 on OpenVINO hardware, as the new version claims to have optimized first-inference latency. It seems that v1.20.0 has been released for onnxruntime-gpu but not yet for onnxruntime-openvino.

Also, is there any more information on how much the first-inference latency has been improved?

Thanks!
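For context, a minimal sketch of how the installed build could be checked and first-inference latency measured with the OpenVINO execution provider via the Python API. The model path `model.onnx` and the input shape are placeholders, not from the original report:

```python
import time

import numpy as np
import onnxruntime as ort

# Confirm which build is installed and which providers it exposes.
print("onnxruntime version:", ort.__version__)
print("available providers:", ort.get_available_providers())

# "model.onnx" is a placeholder; fall back to CPU if OpenVINO is unavailable.
sess = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
)

input_name = sess.get_inputs()[0].name
# Placeholder input shape; adjust to the actual model.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

# The first run includes provider compilation/warm-up cost.
start = time.perf_counter()
sess.run(None, {input_name: dummy})
print(f"first inference:  {time.perf_counter() - start:.3f} s")

# Subsequent runs reflect steady-state latency.
start = time.perf_counter()
sess.run(None, {input_name: dummy})
print(f"second inference: {time.perf_counter() - start:.3f} s")
```

Comparing the two timings on the same model under v1.19.x and v1.20.0 would show how much the first-inference latency actually improves on a given setup.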

To reproduce

N/A

Urgency

No response

Platform

Linux

OS Version

N/A

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

N/A

ONNX Runtime API

Python

Architecture

X64

Execution Provider

OpenVINO

Execution Provider Library Version

No response

tianleiwu commented 1 week ago

https://github.com/intel/onnxruntime/issues/495