ngxson / wllama

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

Just a heads up: Wllama crashes on Mobile Chrome DEV #126

Open flatsiedatsie opened 1 month ago

flatsiedatsie commented 1 month ago

Wllama runs great on Mobile Chrome, but while testing Speech-to-Text I tried my project in a number of different mobile browsers, and Wllama crashes on Mobile Chrome Dev. I could debug it further if you like.

Tried another AI model, same result (and the page crashed).
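
For reference, a minimal sketch of how a page might load and run a model with wllama, in case it helps narrow down a reproduction. This assumes the documented `@wllama/wllama` API (`Wllama`, `loadModelFromUrl`, `createCompletion`); the WASM config paths and the model URL below are placeholders, not the actual ones from my project, and the exact set of config-path entries may differ between wllama versions.

```js
import { Wllama } from '@wllama/wllama';

// Placeholder paths to the wasm binaries shipped with the package --
// adjust to wherever your bundler actually serves them from.
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/esm/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/esm/multi-thread/wllama.wasm',
};

async function run() {
  const wllama = new Wllama(CONFIG_PATHS);

  // Placeholder GGUF model URL -- swap in the model that triggers the crash.
  await wllama.loadModelFromUrl(
    'https://huggingface.co/ggml-org/models/resolve/main/tinyllamas/stories15M-q4_0.gguf'
  );

  // Run a short completion; on Mobile Chrome Dev this is where the tab crashes.
  const output = await wllama.createCompletion('Once upon a time,', {
    nPredict: 32,
    sampling: { temp: 0.7, top_k: 40, top_p: 0.9 },
  });
  console.log(output);
}

run().catch(console.error);
```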