quic / ai-hub-models

The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) and ready to deploy on Qualcomm® devices.
https://aihub.qualcomm.com
BSD 3-Clause "New" or "Revised" License

No Python on Windows ARM available for < 3.11 #89

Closed · nsteblay closed this issue 6 days ago

nsteblay commented 1 month ago

How can I run the demos on a new Copilot+ PC when the Qualcomm libraries require Python < 3.11? Python on Windows ARM is only available for >= 3.11.

EwoutH commented 1 month ago

Python platform support for Windows Arm64 has improved significantly in recent versions. I can perfectly understand the Python >= 3.11 requirement.

Why do you want to use such an old Python version?

nsteblay commented 1 month ago

Installation (From Qualcomm documentation)

We currently support Python >=3.8 and <= 3.10. We recommend using a Python virtual environment (miniconda or virtualenv).

I get the following errors when using Python 3.11.9 ...

pip install "qai_hub_models[llama_v3_8b_chat_quantized]"

ERROR: Ignored the following versions that require a different python version: 0.0.0 Requires-Python >=3.8, <3.11; 0.10.0 Requires-Python <3.11,>=3.8; 0.11.0 Requires-Python <3.11,>=3.8; 0.11.1 Requires-Python <3.11,>=3.8; 0.11.2 Requires-Python <3.11,>=3.8; 0.11.3 Requires-Python <3.11,>=3.8; 0.2.0 Requires-Python >=3.8, <3.11; 0.2.1 Requires-Python >=3.8, <3.11; 0.2.2 Requires-Python >=3.8, <3.11; 0.2.3 Requires-Python >=3.8, <3.11; 0.2.5 Requires-Python >=3.8, <3.11; 0.2.6 Requires-Python >=3.8, <3.11; 0.2.7 Requires-Python >=3.8, <3.11; 0.2.71 Requires-Python >=3.8, <3.11; 0.3.0 Requires-Python >=3.8, <3.11; 0.3.1 Requires-Python >=3.8, <3.11; 0.3.2 Requires-Python >=3.8, <3.11; 0.4.0 Requires-Python >=3.8, <3.11; 0.4.1 Requires-Python <3.11,>=3.8; 0.5.0 Requires-Python <3.11,>=3.8; 0.5.1 Requires-Python <3.11,>=3.8; 0.6.0 Requires-Python <3.11,>=3.8; 0.7.0 Requires-Python <3.11,>=3.8; 0.8.0 Requires-Python <3.11,>=3.8; 0.9.0 Requires-Python <3.11,>=3.8; 0.9.2 Requires-Python <3.11,>=3.8
ERROR: Could not find a version that satisfies the requirement qai_hub_models[llama_v3_8b_chat_quantized] (from versions: none)
ERROR: No matching distribution found for qai_hub_models[llama_v3_8b_chat_quantized]
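For context, this output means pip, running under Python 3.11.9, filtered out every published qai_hub_models release because each one declares Requires-Python <3.11; with no candidate versions left, the install fails. A minimal way to confirm which interpreter pip is bound to (not specific to qai_hub_models):

python -c "import sys; print(sys.version); print(sys.executable)"
pip --version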

EwoutH commented 1 month ago

Sorry, I read it the other way around, that you wanted support for 3.10 and lower.

Support for 3.11 and higher would be obvious indeed.

nsteblay commented 1 month ago

I thought I might be missing something. Documentation states that the Snapdragon X Elite is supported, but I'm not sure how I can run the examples if the appropriate Python version isn't supported. I don't understand how this could have possibly been tested.

gustavla commented 1 month ago

We recommend installing x86-64 Python because some dependent packages aren't available as ARM64 wheels yet (see "Windows on ARM" in our documentation: https://app.aihub.qualcomm.com/docs/hub/getting_started.html#installation). It will take a while for package providers to catch up to this new platform; we apologize for the inconvenience while the ecosystem moves to native ARM64 wheels.
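As a rough sketch of that workaround, assuming the x86-64 (amd64) Python 3.10 installer from python.org has been installed alongside any native ARM64 Python (the py launcher tag and environment name below are illustrative):

py -3.10 -m venv qai_env
qai_env\Scripts\activate
pip install "qai_hub_models[llama_v3_8b_chat_quantized]"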

For instance, for the AI Hub client to communicate with AI Hub, we require h5py. This package is not available as an ARM64 wheel for Windows (https://pypi.org/project/h5py/#files). Installing this dependency and others from source is possible, but non-trivial. Windows on ARM has built-in x86-64 emulation, so the x86-64 Python build works out of the box.
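If it's unclear which build is active, a quick sanity check (the exact strings may vary, but the x86-64 build running under emulation typically reports AMD64, while a native ARM64 build reports ARM64):

python -c "import platform; print(platform.machine(), platform.architecture())"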

We will separately work on expanding the supported version beyond 3.10. Thanks for the feedback!

nsteblay commented 1 month ago

Awesome! Thanks for the info, appreciate it.