ngxson / wllama

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

Add WebGPU support #66

Open ngxson opened 5 months ago

ngxson commented 5 months ago

We don't have many details for now. This issue is for keeping track of the upstream issue: https://github.com/ggerganov/llama.cpp/issues/7773
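Until upstream WebGPU support lands, an application embedding wllama could at least feature-detect WebGPU in the browser to decide whether a GPU-enabled build would be usable. A minimal sketch — the `hasWebGPU` helper is hypothetical and not part of wllama's API; it only checks the standard `navigator.gpu` entry point:

```typescript
// Hypothetical helper: detect whether the WebGPU API is available and an
// adapter can actually be obtained. Not part of wllama; this only probes the
// standard `navigator.gpu` entry point.
type GPULike = { requestAdapter(): Promise<unknown | null> };

async function hasWebGPU(
  nav: { gpu?: GPULike } = (globalThis as any).navigator ?? {}
): Promise<boolean> {
  if (!nav.gpu) return false; // browser does not expose the WebGPU API at all
  try {
    // An adapter of null means the API exists but no suitable GPU is exposed.
    return (await nav.gpu.requestAdapter()) !== null;
  } catch {
    return false;
  }
}
```

An app could call this at startup and fall back to the existing WASM (CPU) build when it returns `false`.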