mlc-ai / web-llm

High-performance In-browser LLM Inference Engine
https://webllm.mlc.ai
Apache License 2.0

[FEATURE REQUEST] Add support for NPUs? #612

Open Iternal-JBH4 opened 1 month ago

Iternal-JBH4 commented 1 month ago

As NPUs become more common, we see a big opportunity to make WebLLM more easily accessible to a mass audience if NPU support is added via WebNN.

Would be curious to get some insight into the status of this effort and whether it is on the roadmap.

Thank you!
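
For context, a minimal sketch of how a page might feature-detect an NPU-backed WebNN context today. This is only an illustration: `navigator.ml.createContext` and the `"npu"` deviceType come from the WebNN draft spec, are not yet typed in standard TypeScript DOM libs, and are not part of WebLLM's current API.

```typescript
// Sketch: probe the (draft) WebNN API for an NPU-backed context.
// navigator.ml is not in lib.dom.d.ts yet, so we cast to any.
async function detectNpu(): Promise<boolean> {
  const ml = (navigator as any).ml;
  if (!ml || typeof ml.createContext !== "function") {
    return false; // WebNN is not exposed by this browser
  }
  try {
    // The "npu" deviceType is from the WebNN spec draft; browsers may
    // reject it or silently fall back to another device.
    await ml.createContext({ deviceType: "npu" });
    return true;
  } catch {
    return false; // WebNN present, but no NPU backend available
  }
}
```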

maru7777 commented 1 month ago

Also, many Linux computers cannot use WebGPU in the browser. I hope the authors can add CPU support.
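
Since WebLLM currently requires WebGPU, a rough sketch of how an app can detect its absence and fail gracefully (or branch to some other fallback) is shown below. `CreateMLCEngine` and `initProgressCallback` are from the `@mlc-ai/web-llm` package; the fallback path itself is hypothetical, as no CPU backend exists in WebLLM today.

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Sketch: check for WebGPU before initializing WebLLM, so users on
// browsers without WebGPU (common on some Linux setups) get a clear
// error instead of a crash deep inside engine initialization.
async function initEngine(modelId: string) {
  if (!("gpu" in navigator)) {
    throw new Error("WebGPU is not available in this browser.");
  }
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    throw new Error("No WebGPU adapter found; cannot run WebLLM here.");
  }
  return CreateMLCEngine(modelId, {
    initProgressCallback: (report) => console.log(report.text),
  });
}
```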