ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
https://ngxson.github.io/wllama/examples/basic/
MIT License

Add WebGPU support #66

Open · ngxson opened 3 weeks ago

ngxson commented 3 weeks ago

We don't have many details for now. This issue is currently for keeping track of the upstream issue: https://github.com/ggerganov/llama.cpp/issues/7773
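Until the upstream backend lands, one likely prerequisite on the wllama side is simply detecting whether the browser exposes WebGPU at all, so the library could fall back to the existing WASM (CPU) path when it doesn't. Below is a minimal, hypothetical TypeScript sketch, not part of wllama's current API, using the standard `navigator.gpu` entry point; it assumes WebGPU type definitions (e.g. `@webgpu/types`) are available in the project.

```ts
// Hypothetical helper (assumption, not wllama API): check whether the
// browser supports WebGPU before attempting to use a GPU backend.
async function hasWebGpu(): Promise<boolean> {
  // navigator.gpu is only defined in browsers that ship WebGPU
  if (!('gpu' in navigator)) return false;
  try {
    // requestAdapter() resolves to null when no suitable GPU is available
    const adapter = await navigator.gpu.requestAdapter();
    return adapter !== null;
  } catch {
    return false;
  }
}
```

A check like this could gate backend selection at model load time, keeping the current WASM path as the default whenever the adapter request fails.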