Closed: aropb closed this issue 6 days ago
Rich possibilities of using CPU, GPU, VPU, NPU
SYCL (Intel) is supported in llama.cpp
I think one benefit of supporting an OpenVINO backend is that it would enable use of Intel NPUs.
This issue was closed because it has been inactive for 14 days since being marked as stale.
Prerequisites
Feature Description
OpenVINO backend support request: https://docs.openvino.ai/2024/index.html
Motivation
OpenVINO offers rich possibilities for running inference across CPU, GPU, VPU, and NPU devices.
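To illustrate the motivation, here is a minimal sketch of how a backend might choose among the device types OpenVINO can expose. This is not llama.cpp or OpenVINO code; the device names and the preference order are illustrative assumptions only.

```python
# Hypothetical device-selection helper (illustrative only; not part of
# llama.cpp or the OpenVINO API). Device name strings are assumptions.
PREFERENCE = ["NPU", "GPU", "VPU", "CPU"]

def pick_device(available):
    """Return the most preferred device present in `available`,
    falling back to CPU if nothing in the preference list matches."""
    for dev in PREFERENCE:
        if dev in available:
            return dev
    return "CPU"

print(pick_device(["CPU", "GPU"]))  # GPU
print(pick_device(["CPU"]))         # CPU
```

In practice, a real backend would query the runtime for available devices at startup rather than hard-coding a list.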
Possible Implementation
No response