huggingface / candle

Minimalist ML framework for Rust
Apache License 2.0

WebNN support #2604

Open allsey87 opened 2 weeks ago

allsey87 commented 2 weeks ago

I was looking through the issues in this repository and noticed that no one has mentioned WebNN before. Is WebNN on Candle's roadmap for doing inference in the browser?

akshayballal95 commented 1 week ago

I believe you can use Candle with WASM to do inference in the browser.
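For reference, a minimal sketch of what that looks like (assumptions: a crate depending on candle-core and wasm-bindgen, built with wasm-pack for the wasm32 target; the function name and the toy element-wise op are placeholders for a real model's forward pass, not part of Candle's own wasm examples):

```rust
use candle_core::{Device, Tensor};
use wasm_bindgen::prelude::*;

/// Exported to JavaScript; takes a Float32Array and returns one.
#[wasm_bindgen]
pub fn scale_input(input: Vec<f32>) -> Result<Vec<f32>, JsError> {
    // Candle's CPU device works on wasm32; SIMD can be enabled at build time
    // with RUSTFLAGS="-C target-feature=+simd128".
    let device = Device::Cpu;
    let n = input.len();
    let x = Tensor::from_vec(input, (1, n), &device)?;
    // Stand-in for a real model forward pass: a single element-wise op.
    let y = (x * 2.0)?;
    Ok(y.flatten_all()?.to_vec1::<f32>()?)
}
```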

allsey87 commented 1 week ago

Sure, but there is a big performance difference between using plain WASM/SIMD and pairing it with either WebNN or WebGPU, both of which provide hardware acceleration.


DimitriTimoz commented 6 days ago

Maybe wgpu could be an option for targeting different hardware, but since the CUDA and Metal backends are already implemented, a WebGPU backend might be the better option for performance and robustness in the browser.
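For what it's worth, a minimal sketch of how wgpu selects a backend at runtime, which is the portability argument here (assumes the wgpu and pollster crates, neither of which is a Candle dependency; on wasm32 you would use an async entry point instead of pollster):

```rust
fn main() {
    // Create an instance with the default set of backends
    // (Vulkan, Metal, DX12, GL, or WebGPU depending on the platform).
    let instance = wgpu::Instance::default();

    // Ask for any compatible adapter; this is where wgpu picks the backend.
    let adapter = pollster::block_on(
        instance.request_adapter(&wgpu::RequestAdapterOptions::default()),
    )
    .expect("no compatible GPU adapter found");

    let info = adapter.get_info();
    println!("backend: {:?}, device: {}", info.backend, info.name);
}
```

The same code path would report the WebGPU backend when compiled for the browser and Metal/Vulkan/DX12 natively, which is what would make a single wgpu-based Candle backend portable across that hardware.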