microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

Windows arm64 (Snapdragon(R) X 12-core X1E80100 @ 3.40 GHz) [Feature Request] #21295

Open latermarch opened 3 months ago

latermarch commented 3 months ago

Describe the feature request

```cpp
Ort::Env env = Ort::Env{ORT_LOGGING_LEVEL_ERROR, "Default"};
std::unordered_map<std::string, std::string> qnn_options;
qnn_options["backend_path"] = "QnnHtp.dll";
Ort::SessionOptions session_options;
session_options.AppendExecutionProvider("QNN", qnn_options);
Ort::Session session(env, model_path, session_options);
```

The code crashes at `AppendExecutionProvider`.
onnxruntime 1.18.1, QNN SDK version 2.22.10
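For reference, a minimal sketch of the same setup with the call wrapped in try/catch, assuming the standard ONNX Runtime C++ API. If the failure is a reported provider error rather than a hard crash, `Ort::Exception` would surface the message (for example, an unloadable `QnnHtp.dll`); the model path `L"model.onnx"` is a placeholder:

```cpp
#include <onnxruntime_cxx_api.h>

#include <iostream>
#include <string>
#include <unordered_map>

int main() {
  try {
    Ort::Env env{ORT_LOGGING_LEVEL_ERROR, "Default"};

    // Path to the QNN HTP backend DLL from the QNN SDK; it must be
    // resolvable from the working directory or PATH at runtime.
    std::unordered_map<std::string, std::string> qnn_options;
    qnn_options["backend_path"] = "QnnHtp.dll";

    Ort::SessionOptions session_options;
    session_options.AppendExecutionProvider("QNN", qnn_options);

    // Placeholder model path (Windows builds take a wide-char path).
    Ort::Session session(env, L"model.onnx", session_options);
  } catch (const Ort::Exception& e) {
    // EP registration/load failures that are reported (not crashes)
    // land here with a diagnostic message.
    std::cerr << "ORT error: " << e.what() << std::endl;
    return 1;
  }
  return 0;
}
```

If this still crashes inside `AppendExecutionProvider` rather than throwing, that points at the native QNN EP/SDK interaction rather than the calling code.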

Describe scenario use case

The code crashes at `session_options.AppendExecutionProvider("QNN", qnn_options);`.

latermarch commented 3 months ago

[screenshot of the crash attached]