ngxson / wllama

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

main: initialize main example #96

Closed · ngxson closed this 3 months ago

ngxson commented 3 months ago

Initialize the "main" example with the following functionality:

- reference UI (built from off-the-shelf daisyUI components)

[Screenshot: the reference UI of the "main" example]

felladrin commented 3 months ago

The UI is amazing! Thanks for adding that! The model loading feature is marvelous! It's the perfect playground to test models before deciding to use them!

ngxson commented 3 months ago

@felladrin Yup, thanks! Feel free to report bugs if you see any (the UI is still in alpha, btw)