The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for deployment on Qualcomm® devices.
See the supported On-Device Runtimes, Hardware Targets & Precision, Chipsets, and Devices.
The package is available via pip:
# NOTE for Snapdragon X Elite users:
# Only AMD64 (64-bit) Python is supported on Windows.
# Installation will fail when using Windows ARM64 Python.
pip install qai_hub_models
Some models (e.g. YOLOv7) require additional dependencies that can be installed as follows:
pip install "qai_hub_models[yolov7]"
Many features of AI Hub Models (such as model compilation, on-device profiling, etc.) require access to Qualcomm® AI Hub:
qai-hub configure --api_token API_TOKEN
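Once the token is configured, you can verify access from Python with the qai_hub client, for example by listing the cloud-hosted devices available to your account (a minimal sketch; it assumes the token above was configured successfully):

```python
import qai_hub as hub

# Listing the cloud-hosted devices is a quick way to confirm that the
# API token was configured correctly.
for device in hub.get_devices():
    print(device.name)
```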
All models in our directory can be compiled and profiled on a hosted Qualcomm® device:
pip install "qai_hub_models[yolov7]"
python -m qai_hub_models.models.yolov7.export [--target-runtime ...] [--device ...] [--help]
Using Qualcomm® AI Hub, the export script will compile the model for the chosen device and target runtime, profile the compiled asset on a real cloud-hosted device, run on-device inference to compare against the PyTorch output, and download the compiled model to disk.
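The snippet below is a rough sketch of that same flow using the qai_hub Python client directly; Model.from_pretrained(), the input shape, the input name, and the device name are illustrative assumptions, not the exact values used by the export script.

```python
import qai_hub as hub
import torch

from qai_hub_models.models.yolov7 import Model  # pretrained-model entry point (assumed)

# Load the pretrained PyTorch model and trace it so AI Hub can compile it.
torch_model = Model.from_pretrained()
example_input = torch.rand(1, 3, 640, 640)  # illustrative input shape
traced_model = torch.jit.trace(torch_model, example_input)

device = hub.Device("Samsung Galaxy S23")  # any hosted device name works here

# 1. Compile the model for the chosen device and target runtime.
compile_job = hub.submit_compile_job(
    model=traced_model,
    device=device,
    input_specs=dict(image=(1, 3, 640, 640)),
)
target_model = compile_job.get_target_model()

# 2. Profile the compiled model on a real cloud-hosted device.
profile_job = hub.submit_profile_job(model=target_model, device=device)

# 3. Run on-device inference with sample data to compare against PyTorch.
inference_job = hub.submit_inference_job(
    model=target_model,
    device=device,
    inputs=dict(image=[example_input.numpy()]),
)

# 4. Download the compiled model to disk.
target_model.download("yolov7_compiled")
```

The export script performs these steps for you and exposes the runtime and device choices through its --target-runtime and --device flags.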
Most models in our directory contain CLI demos that run the model end-to-end:
pip install "qai_hub_models[yolov7]"
# Predict and draw bounding boxes on the provided image
python -m qai_hub_models.models.yolov7.demo [--image ...] [--on-device] [--help]
Many end-to-end demos use AI Hub to run inference on a real cloud-hosted device (if the --on-device flag is set). All end-to-end demos also run locally via PyTorch.
Native applications that can run our models (with pre- and post-processing) on physical devices are published in the AI Hub Apps repository.
Python applications are defined for all models (under the qai_hub_models.models.<model_name> package).
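As a minimal sketch, a model's app can be driven from Python like this; the app class name, its constructor, and the prediction method are model-specific, so treat the names below as illustrative rather than the exact API:

```python
from PIL import Image

from qai_hub_models.models.yolov7 import Model
from qai_hub_models.models.yolov7.app import YoloV7DetectionApp  # app class name is an assumption

# Wrap the pretrained model with its pre- and post-processing pipeline.
app = YoloV7DetectionApp(Model.from_pretrained())

# Run the full pipeline on a local image (path and method name are illustrative).
image = Image.open("input.jpg")
predictions = app.predict(image)
print(predictions)
```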
| Runtime | Supported OS |
| --- | --- |
| Qualcomm AI Engine Direct | Android, Linux, Windows |
| LiteRT (TensorFlow Lite) | Android, Linux |
| ONNX | Android, Linux, Windows |
| Device Compute Unit | Supported Precision |
| --- | --- |
| CPU | FP32, INT16, INT8 |
| GPU | FP32, FP16 |
| NPU (includes Hexagon DSP, HTP) | FP16*, INT16, INT8 |

*Some older chipsets do not support FP16 inference on their NPU.
| Model | README |
| --- | --- |
| OpenAI-Clip | qai_hub_models.models.openai_clip |
| TrOCR | qai_hub_models.models.trocr |
Slack: https://aihub.qualcomm.com/community/slack
GitHub Issues: https://github.com/quic/ai-hub-models/issues
Email: ai-hub-support@qti.qualcomm.com
Qualcomm® AI Hub Models is licensed under the BSD-3-Clause license. See the LICENSE file.