intel / onnxruntime

ONNX Runtime: cross-platform, high performance scoring engine for ML models
MIT License

fix: updated data ops to support the complete graph on OVEP #374

Closed ankitm3k closed 3 months ago

ankitm3k commented 3 months ago

Description

The ONNX model provided by the issue author was not fully supported by OVEP and failed inference with the ort_perf_test app. This PR enables the GRU and LogSoftmax ops, which allows the whole model graph to execute on OVEP. The unit test for the GRU op is disabled.
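Conceptually, the EP's data ops keep a list of supported operator types, and only nodes whose type is on that list are claimed by OpenVINO; any unsupported node forces graph partitioning and CPU fallback. A minimal sketch of that capability check (the set contents and function name here are illustrative, not the actual data_ops implementation):

```python
# Hypothetical sketch of the OVEP capability check: a set of supported op
# types decides which nodes the OpenVINO EP can claim. GRU and LogSoftmax
# are the ops this PR adds to the supported list.
SUPPORTED_OPS = {"Conv", "MatMul", "Relu", "GRU", "LogSoftmax"}

def fully_supported(graph_op_types):
    """True when every node in the graph can run on the OpenVINO EP,
    i.e. no partitioning or CPU fallback is needed."""
    return all(op in SUPPORTED_OPS for op in graph_op_types)

# With GRU and LogSoftmax supported, a graph using them runs entirely on OVEP.
print(fully_supported(["MatMul", "GRU", "LogSoftmax"]))
```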

We also investigated the inference output over multiple iterations with a single common input: the model produced consistent and correct output across all iterations during testing. This resolves the regression where output diverged after the first inference for this model architecture.
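The consistency check described above can be sketched as a small helper that runs the same input repeatedly and compares each result against the first run. The `run_fn` callable is a stand-in assumption for an ONNX Runtime session configured with the OpenVINO EP:

```python
import numpy as np

def check_inference_consistency(run_fn, input_data, iterations=5):
    """Run the same input several times and verify every output matches
    the first run's output (no post-first-inference drift)."""
    baseline = run_fn(input_data)
    for _ in range(1, iterations):
        if not np.allclose(run_fn(input_data), baseline):
            return False
    return True

# Deterministic stand-in for a real inference session (hypothetical).
model = lambda x: np.tanh(x)
print(check_inference_consistency(model, np.arange(4.0)))
```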

This PR fixes https://github.com/microsoft/onnxruntime/issues/19975