Open IntranetFactory opened 3 months ago
Hi @IntranetFactory, please ensure you have the latest drivers installed for your NPU.
I'm using a Copilot+ PC with SnapDragon X CPU - will the intel-npu-library work in that case?
@IntranetFactory Thanks for confirming that you're running a Qualcomm device. See https://learn.microsoft.com/en-us/windows/ai/npu-devices/ for the latest drivers and ONNX Runtime support info.
Qualcomm Snapdragon X: Currently, developers should target the Qualcomm QNN Execution Provider (EP), which uses the Qualcomm AI Engine Direct SDK (QNN). Pre-built packages with QNN support are available to download. This is the same stack currently used by the Windows Copilot Runtime and experiences on Copilot+ PC Qualcomm devices.
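In case it helps, "targeting the QNN EP" generally means appending the QNN execution provider when you create your ONNX Runtime session. Below is a minimal C# sketch using the base ONNX Runtime API; the model path and provider options are illustrative only (the GenAI `Model` API discussed later in this thread is a separate layer):

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

// Hypothetical model path, for illustration only.
var modelPath = @"C:\models\model.onnx";

using var options = new SessionOptions();
// Append the QNN execution provider; "backend_path" selects the
// QNN backend library (QnnHtp.dll runs on the NPU/HTP).
options.AppendExecutionProvider("QNN", new Dictionary<string, string>
{
    { "backend_path", "QnnHtp.dll" }
});

// Sessions created with these options will dispatch supported
// operators to the QNN EP and fall back to CPU for the rest.
using var session = new InferenceSession(modelPath, options);
```

This assumes the Microsoft.ML.OnnxRuntime.QNN package is installed so the QNN native libraries are present next to the app.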
I'm sorry, this is the first time I've used QNN - what does "target the Qualcomm QNN EP" mean? Do I just need to install that provider, or do I also need to modify the cookbook (e.g. change CPU, install NuGet packages)?
I installed the Microsoft.ML.OnnxRuntime.QNN NuGet package. When I select "Any CPU" I get "System.DllNotFoundException: 'Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'" in var model = new Model(modelPath);
When I select the ARM64 architecture I get build errors.
@IntranetFactory
You are experiencing a DllNotFoundException when trying to use the Microsoft.ML.OnnxRuntime.QNN NuGet package with the “Any CPU” configuration. This error typically occurs when the DLL ‘onnxruntime-genai’ or one of its dependencies is not found by the system.
To resolve this issue, you might want to ensure that:
- The NuGet package is properly installed and all required dependencies are included.
- The project is configured to copy the native dependencies to the output directory.
- The native dependencies are compatible with the architecture you're targeting.

If you're still facing issues, consider filing an issue on the ONNX Runtime repo asking them to ensure compatibility of the native dependencies with Snapdragon.
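On the architecture point, a sketch of the settings an SDK-style .csproj might use so NuGet restores and copies the ARM64 native binaries instead of the Any CPU defaults (the target framework shown is only an example):

```xml
<!-- Illustrative project settings; adjust the target framework to your project. -->
<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <!-- Build for ARM64 rather than Any CPU so ARM64 native DLLs are selected. -->
  <PlatformTarget>ARM64</PlatformTarget>
  <RuntimeIdentifier>win-arm64</RuntimeIdentifier>
</PropertyGroup>
```

With a runtime identifier set, `dotnet build` and `dotnet publish` pick the matching native assets from the package's runtimes folder.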
Additionally, check the documentation for the Microsoft.ML.OnnxRuntime.QNN package for any specific installation or configuration instructions, and look for reports from anyone else who has encountered a build error when selecting the ARM64 architecture after installing the same package.
I would suggest reaching out to the maintainers of the Microsoft.ML.OnnxRuntime.QNN package or seeking support from the community, as they might have encountered and resolved similar issues.
@IntranetFactory Thank you for your question. The current ONNX Runtime for Generative AI example is based on the x86 framework; ARM64 support is planned for the future. You can follow the roadmap in the GitHub repo: https://github.com/microsoft/onnxruntime-genai. If you are using a Copilot+ PC with ARM64, it is recommended that you use Phi Silica: https://learn.microsoft.com/en-us/windows/ai/apis/phi-silica
I would love to try Phi Silica, but it seems it's also not available yet: https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel says "Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release."
or is there any other way to get that?
Are just the C# samples not working on ARM64? So should Python work, or does Phi-3 currently not work on ARM64 at all?
The GenAI NuGet packages don't support ARM64 currently; there is an issue tracking this here: https://github.com/microsoft/onnxruntime-genai/issues/637.
Adding @natke for awareness
please use this https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.QNN
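If adding the package through the project file rather than the NuGet UI, the reference would look something like this (the version number is only an example; check nuget.org for the latest):

```xml
<ItemGroup>
  <!-- QNN-enabled ONNX Runtime build; version shown is illustrative. -->
  <PackageReference Include="Microsoft.ML.OnnxRuntime.QNN" Version="1.18.0" />
</ItemGroup>
```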
@kinfey I tried that already, which causes build errors https://github.com/microsoft/Phi-3CookBook/issues/84#issuecomment-2217213880
> I would love to try Phi Silica - but it seems that it's also not available: https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel
> Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release.
> or is there any other way to get that?
Does anyone know more on this?
I'm trying md\07.Labs\Csharp\src\LabsPhi301 on a new Copilot+ laptop. I adjusted modelPath to point to the correct folder.
When I run the lab I get:
Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'
After the first failure I updated all NuGet packages, but still got the same result.
Should that sample work on a Copilot+ laptop?