ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License

llm_load_vocab: special_eos_id is not in special_eog_ids - the tokenizer config may be incorrect #131

Open flatsiedatsie opened 5 hours ago

flatsiedatsie commented 5 hours ago

I've upgraded to the latest version of Wllama, and I'm experimenting with setting the seed value explicitly.
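For context, a minimal sketch of what "setting the seed explicitly" might look like with wllama. The model URL is a placeholder, and the exact shape of the sampling config (in particular whether a `seed` field is exposed) is an assumption — check the `SamplingConfig` type shipped with your wllama release:

```typescript
import { Wllama } from "@wllama/wllama";

// Paths to the wasm artifacts; adjust to where your bundler serves them.
const CONFIG_PATHS = {
  "single-thread/wllama.wasm": "./esm/single-thread/wllama.wasm",
  "multi-thread/wllama.wasm": "./esm/multi-thread/wllama.wasm",
};

async function main() {
  const wllama = new Wllama(CONFIG_PATHS);
  // Placeholder URL -- substitute your own GGUF model.
  await wllama.loadModelFromUrl("https://example.com/model.gguf");

  // With a fixed seed and temp > 0, repeated runs should produce
  // identical completions if seeding works as expected.
  const completion = await wllama.createCompletion("Hello, my name is", {
    nPredict: 32,
    sampling: {
      temp: 0.8,
      seed: 42, // assumed field name; verify against SamplingConfig
    },
  });
  console.log(completion);
}

main();
```

Running this twice and diffing the two completions is a quick way to check whether the seed is actually being honored.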

ngxson commented 2 hours ago

The real (and always the first) question is: does it work with native llama.cpp?
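To answer that, one way to reproduce outside the browser is to run the same model, prompt, and seed through llama.cpp's native CLI. Flag names below are per recent llama.cpp builds, and the model path is a placeholder:

```shell
# Run the same prompt twice with a fixed seed. With identical seeds the
# sampled output should match between runs, and any tokenizer warning
# (e.g. the special_eos_id message) will appear in the model load log.
./llama-cli -m ./model.gguf -p "Hello, my name is" -n 32 --seed 42
```

If the native run shows the same warning or the same non-deterministic output, the issue lies in llama.cpp (or the GGUF's tokenizer metadata) rather than in the wllama binding.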