intel / onnxruntime

ONNX Runtime: cross-platform, high performance scoring engine for ML models
MIT License

fix: updated data ops to support the complete graph on OVEP #371

Closed ankitm3k closed 3 months ago

ankitm3k commented 3 months ago

Description

The ONNX model provided by the issue author was not fully supported by OVEP and failed inference with the ort_perf_test app. This PR enables the GRU and LogSoftmax ops, which allows the whole model graph to run on OVEP during execution.

We also investigated the inference output over multiple iterations with a single common input: the model produced consistent and correct output across all inference iterations during testing. This resolves the post-first-inference output regression reported for the given model architecture.
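The consistency check described above can be sketched as a small helper that runs the same input repeatedly and compares every result against the first. This is a minimal illustration, not the actual test used in the PR; `run_fn` is a hypothetical stand-in for whatever performs inference (e.g. a wrapper around an ONNX Runtime session's `run` with the OpenVINO provider):

```python
import numpy as np

def check_consistency(run_fn, input_feed, iterations=5, atol=1e-6):
    """Feed the same input through run_fn several times and verify that
    every iteration matches the first one, guarding against a
    post-first-inference output regression."""
    baseline = run_fn(input_feed)
    for _ in range(iterations - 1):
        out = run_fn(input_feed)
        if not np.allclose(baseline, out, atol=atol):
            return False  # output drifted after the first inference
    return True
```

In practice `run_fn` would wrap something like `session.run(None, {input_name: input_feed})` on a session created with the OpenVINO execution provider, but that wiring is omitted here to keep the sketch self-contained.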

This PR fixes: https://github.com/microsoft/onnxruntime/issues/19975

sfatimar commented 3 months ago

We do not merge changes to intel::master. Please raise a new PR.