MB-Finski closed this 8 months ago
What's with the failing CI?
Hmm... I think it might be related to this: https://github.com/ggerganov/llama.cpp/issues/3484
So we may need to update the Falcon GGUF model file.
EDIT: i.e. updating the deps caused this.
Yup... We will have to update the Falcon model to this one: https://gpt4all.io/models/gguf/gpt4all-falcon-newbpe-q4_0.gguf
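For reference, a minimal sketch of one way the updated GGUF file could be fetched; the `models` target directory and the helper name are assumptions for illustration, not something defined in this repo:

```python
# Hedged sketch: download the updated Falcon GGUF file if it isn't present yet.
# MODEL_DIR and download_model() are illustrative assumptions, not repo code.
import os
import urllib.request

MODEL_URL = "https://gpt4all.io/models/gguf/gpt4all-falcon-newbpe-q4_0.gguf"
MODEL_DIR = "models"  # assumed target directory


def download_model(url: str = MODEL_URL, target_dir: str = MODEL_DIR) -> str:
    """Download the GGUF model file into target_dir and return its local path."""
    os.makedirs(target_dir, exist_ok=True)
    target_path = os.path.join(target_dir, os.path.basename(url))
    if not os.path.exists(target_path):
        urllib.request.urlretrieve(url, target_path)
    return target_path


if __name__ == "__main__":
    print(download_model())
```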
This will close https://github.com/nextcloud/llm/issues/29
Adds support for the NeuralBeagle14 7B 8K context length model.