Open · June1124 opened 6 days ago
Have you tried ONNX Runtime 1.18.0? And which QNN SDK version are you using?
QNN SDK: 2.18.0.240101
I confirmed from the ONNX Runtime release notes that onnxruntime 1.17.0 supports QNN SDK 2.18.
Do I need to test with onnxruntime 1.18.0?
Describe the issue
yolov8-pose inference fails when using onnxruntime with the GPU execution provider on a QUALCOMM 8155 device.
![image](https://github.com/microsoft/onnxruntime/assets/52447302/9e8fcbfa-a7ab-4f58-ac46-70f369ddd6fd)
To reproduce
```cpp
// qnn_options["profiling_level"] = "basic";  // QNN profiling level, options: 'basic', 'detailed'; default 'off'.
// qnn_options["htp_performance_mode"] = "sustained_high_performance";  // QNN performance mode, options: 'burst',
//   'balanced', 'default', 'high_performance', 'high_power_saver', 'low_balanced', 'extreme_power_saver',
//   'low_power_saver', 'power_saver', 'sustained_high_performance'. Default is 'default'.
sessionOptions.AppendExecutionProvider("QNN", qnn_options);
```
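For context, a minimal sketch of how the options above fit into QNN EP session creation with the ONNX Runtime C++ API. The `backend_path` value and the model filename below are assumptions for illustration, not details taken from this issue:

```cpp
#include <onnxruntime_cxx_api.h>
#include <string>
#include <unordered_map>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "yolov8-pose");
  Ort::SessionOptions sessionOptions;

  // QNN EP provider options. "backend_path" selects the QNN backend
  // library; the HTP backend name here is an assumed typical value.
  std::unordered_map<std::string, std::string> qnn_options;
  qnn_options["backend_path"] = "libQnnHtp.so";
  // qnn_options["profiling_level"] = "basic";
  // qnn_options["htp_performance_mode"] = "sustained_high_performance";

  sessionOptions.AppendExecutionProvider("QNN", qnn_options);

  // "yolov8-pose.onnx" is a placeholder model path.
  Ort::Session session(env, "yolov8-pose.onnx", sessionOptions);
  return 0;
}
```

This requires linking against an onnxruntime build with the QNN EP enabled; on-device, the QNN backend libraries must also be present on the library search path.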
Urgency
No response
Platform
Android
OS Version
9.0
ONNX Runtime Installation
Built from Source
Compiler Version (if 'Built from Source')
31
Package Name (if 'Released Package')
None
ONNX Runtime Version or Commit ID
1.17.0
ONNX Runtime API
C++/C
Architecture
ARM64
Execution Provider
SNPE, Other / Unknown
Execution Provider Library Version
QNN