WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
451 stars · 23 forks
Error "llama_model_load: error loading model: illegal split file: <number>, model must be loaded with the first split" #135
Open
felladrin opened 5 hours ago
While setting up v2.0, I've noticed it's not able to load this model:
[Actually, the same error happens with all the models I split (which were all working on v1).]
It downloads the models correctly, but triggers the following error when loading the model:
Any clues?
Device info
OS: macOS
Browser: Tested on Brave, Chromium, and Safari
How to reproduce
Felladrin/gguf-Q8_0-SmolLM2-135M-Instruct
and model.shard-00001-of-00005.gguf
, and save.
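For context on the error text itself: llama.cpp refuses to load a split model unless the load starts from the first shard, and the shard index/total are encoded both in the GGUF split metadata and in the conventional filename (`-00001-of-00005.gguf`). A minimal sketch of that filename convention check (hypothetical helper, not part of wllama or llama.cpp), which can help verify the loader is being pointed at shard 1:

```python
import re

# llama.cpp-style split suffix: "-<index>-of-<count>.gguf", both zero-padded to 5 digits.
_SPLIT_RE = re.compile(r"-(\d{5})-of-(\d{5})\.gguf$")

def parse_split(filename: str):
    """Return (index, total) for a split shard filename, or None for a non-split file."""
    m = _SPLIT_RE.search(filename)
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))

def is_loadable_entry(filename: str) -> bool:
    """A non-split file loads normally; a split model must be opened via its first shard."""
    parsed = parse_split(filename)
    return parsed is None or parsed[0] == 1
```

Under this assumption, `model.shard-00001-of-00005.gguf` is a valid entry point, while passing any later shard first would trip the "model must be loaded with the first split" check.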