rwth-i6 / i6_models

Collection of NN-Model parts
Mozilla Public License 2.0

set explicit execution provider #36

Closed JackTemaki closed 12 months ago

albertz commented 12 months ago

Why is this needed? The tests were already working without it, weren't they?

If there is a reason this is needed, I think this reason should also be stated as a comment in the code.

If there is no reason for this change, then I don't understand why to make this change.

JackTemaki commented 12 months ago

I wanted to quietly test and merge this without comment, but then the tests did not start.

The reason for this PR is the failed test:

```
FAILED tests/test_blstm.py::test_blstm_onnx_export - ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
```

This is likely because onnxruntime was updated from 1.15 to 1.16.
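The fix presumably amounts to passing the `providers` argument explicitly when creating the `InferenceSession`, as the error message demands. A minimal sketch of what that could look like in the test (the helper name and model path are illustrative, not from the PR):

```python
def pick_providers(available):
    """Prefer the CPU provider for a deterministic CI test; otherwise
    fall back to whatever providers this ORT build exposes."""
    preferred = ["CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or list(available)

# In the actual test one would then do (requires onnxruntime >= 1.9):
#   import onnxruntime
#   providers = pick_providers(onnxruntime.get_available_providers())
#   session = onnxruntime.InferenceSession("model.onnx", providers=providers)
```

With the provider list from the error message, this selects `["CPUExecutionProvider"]`, which keeps the test independent of extra providers such as `AzureExecutionProvider` that newer ORT builds enable.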

JackTemaki commented 12 months ago

The test is running now, so I will just merge.