**Closed** — Beenhakker closed this issue 1 year ago.
Hello, after testing locally on a Windows machine (I don't have access to a Mac), I was able to replicate the error with `ggml-nous-gpt4-vicuna-13b.bin`.

I suspect the issue is the backend you are using when loading the model. In my case, loading it with the `gptj` backend fails; it only works with the `llama` backend.

Try changing the backend and let me know if that solves your problem.
Hi,
First off: thanks for this. Exactly what I wanted to build myself.
Second: not all listed models seem to be working on my Mac (Intel).

Working:

Not working: (`gptj_model_load: invalid model file (bad magic)`)