Describe the issue
When running the model with ORT + the QNN EP, we get an error from QNN. Unfortunately, it is a generic error code. It is tempting to blame QNN; however, building a precompiled context binary and running that version of the model succeeds.
This fails the same way on both Android and Windows.
Here is the Android failure (note that this job is only visible to members of the QNN Execution Provider team): https://app.aihub.qualcomm.com/jobs/jp4ll8185
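For reference, a minimal sketch (not our actual harness) of the two paths described above, using the ORT C++ API. The model/backend file names and the EPContext session options shown here are placeholders/assumptions rather than an excerpt from the attached job:

```cpp
// Sketch only: contrasts the failing direct-load path with the context-binary path.
#include <onnxruntime_cxx_api.h>

#include <string>
#include <unordered_map>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "qnn_repro");

  // QNN EP options; backend_path points at the HTP backend library
  // (libQnnHtp.so on Android, QnnHtp.dll on Windows).
  const std::unordered_map<std::string, std::string> qnn_options{
      {"backend_path", "libQnnHtp.so"}};

  // Path 1: load the ONNX model directly with the QNN EP.
  // This is where the generic QNN error surfaces for us.
  {
    Ort::SessionOptions so;
    so.AppendExecutionProvider("QNN", qnn_options);
    Ort::Session session(env, ORT_TSTR("model.onnx"), so);
  }

  // Path 2: have ORT dump a QNN context-binary (EPContext) model first,
  // then load/run the generated model_ctx.onnx instead. This path succeeds.
  {
    Ort::SessionOptions so;
    so.AppendExecutionProvider("QNN", qnn_options);
    so.AddConfigEntry("ep.context_enable", "1");
    so.AddConfigEntry("ep.context_file_path", "model_ctx.onnx");
    Ort::Session compile_session(env, ORT_TSTR("model.onnx"), so);
  }
  return 0;
}
```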
To reproduce
Run the model attached to the AI Hub job linked above.
Urgency
We see this error almost daily with different models.
Platform
Android/Windows
OS Version
Android 13 / Windows 11 (Snapdragon X Elite)
ONNX Runtime Installation
Built from Source
Compiler Version (if 'Built from Source')
No response
Package Name (if 'Released Package')
None
ONNX Runtime Version or Commit ID
1.19.2
ONNX Runtime API
C++/C
Architecture
ARM64
Execution Provider
QNN
Execution Provider Library Version
QNN