Hello, regarding the compiled LLM binaries that can be downloaded directly from AI Hub: which target device are these files built for? Can they run on a Snapdragon 8 Gen 3 chipset?
When I attempted to run inference with them on an 8 Gen 3 smartphone, I encountered the following error:
./genie-t2t-run -c qwen2_7b_instruct_quantized.json -p "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\nWhat is France's capital?<|eot_id|><|start_header_id|>assistant<|end_header_id|>"
[INFO] "Using create From Binary"
[INFO] "Allocated total size = 174998016 across 8 buffers"
[ERROR] "Could not create context from binary for context index = 0 : err 5005"
[ERROR] "Create From Binary FAILED!"
For reference, the configuration files involved are:
qwen2_7b_instruct_quantized.json
htp_backend_ext_config.json
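In case it helps with diagnosis, this is roughly how I would read back the phone's SoC identifier to confirm it really is an 8 Gen 3 part (a minimal sketch; it assumes adb is on PATH, a single connected device, and a build that populates the standard ro.soc.* properties):

```python
# Sketch: query the connected handset for its SoC via standard Android
# system properties. On a Snapdragon 8 Gen 3 phone, ro.soc.model should
# report SM8650 (assumption based on public part numbering).
import subprocess

def getprop(prop: str) -> str:
    """Read an Android system property over adb and return it stripped."""
    result = subprocess.run(
        ["adb", "shell", "getprop", prop],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print("SoC manufacturer:", getprop("ro.soc.manufacturer"))
print("SoC model:", getprop("ro.soc.model"))
```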