You're using an old gguf. For more info: https://github.com/ggerganov/llama.cpp/pull/8627#issuecomment-2260315554
Ah, thank you!
Unfortunately this isn't a model I can easily replace, as it's a specialized model (Dutch language). I'll check if there is a new version of it. But if not, is there something I can do to override this manually?
No new version, though I've asked if one is on the horizon.
You can play with this script to add the missing metadata: https://github.com/ggerganov/llama.cpp/blob/master/gguf-py/scripts/gguf_set_metadata.py
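Roughly, something like this to first inspect which metadata keys the file already has (the file name and key below are just placeholders, use whichever key the error message complains about):

```python
# Inspect the metadata keys in the GGUF file
# (requires the gguf package from llama.cpp's gguf-py, e.g. `pip install gguf`).
from gguf import GGUFReader

reader = GGUFReader("path/to/your-dutch-model.gguf")  # placeholder path
for name, field in reader.fields.items():
    print(name, field.types)

# Then set/override a field in place with the script, for example:
#   python gguf-py/scripts/gguf_set_metadata.py path/to/your-dutch-model.gguf <key> <value>
# (<key> and <value> are placeholders here; the script will ask for confirmation
#  unless you pass --force, and --dry-run lets you preview the change.)
```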
It would be nice to have a default value in the llama.cpp code, so old models won't break. I'll have a look at this later.
This should be fixed in the latest release
Absolutely brilliant. I'm so impressed you made an upstream fix. Thank you!
Just a quick question: I take it this is an issue with the model? Or is there something I can do to fix this? Perhaps add the value manually?
Hmm, I'm actually pretty sure I was able to run this model in the past. Maybe something changed in llama.cpp?
I did just switch to preloading the model separately from starting it. My preload code: