ngxson / wllama

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

Feature request: GitHub build workflow #6

Open flatsiedatsie opened 7 months ago

flatsiedatsie commented 7 months ago

It would rock if there were always a cutting-edge pre-compiled distribution available to download from GitHub, built from the latest code.

ngxson commented 7 months ago

Yes, it would be a useful thing to add.

My idea is to have 2 workflows:

It would also be nice to have some kind of e2e test which opens a headless browser and tests it.
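A minimal sketch of what the build side of such a workflow could look like, assuming an npm-based build script (`npm run build`), a `dist/` output directory, and the third-party `mymindstorm/setup-emsdk` action for installing Emscripten; the file name, branch name, and build commands here are all assumptions, not taken from this repo:

```yaml
# .github/workflows/build.yml (hypothetical name)
name: build

on:
  push:
    branches: [master]       # assumed default branch
  workflow_dispatch:          # allow manual runs

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: recursive        # llama.cpp is assumed to be a git submodule
      - uses: mymindstorm/setup-emsdk@v14   # installs the Emscripten toolchain
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run build             # assumed build script name
      - uses: actions/upload-artifact@v4
        with:
          name: wllama-dist            # downloadable pre-compiled distribution
          path: dist/                  # assumed output directory
```

The uploaded artifact would then be downloadable from each workflow run, which gives the "always a cutting-edge pre-compiled distribution" behavior requested above; a second workflow could later publish tagged builds as releases.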

However, I don't have much time to spare at the moment. It would be nice if someone could help me out with this.