Open DaramG opened 1 year ago
It didn't solve my issue (#95).
This worked for me, thanks. I was having the same issue as #95. I updated the version number then restarted the server and it loaded the model ok.
I get a similar error when loading the 7b chat model, but that's because it's in `.bin` format instead of `.gguf` like code-7b. It produces the following error, which surfaces as the 500 loading error in the UI: `gguf_init_from_file: invalid magic characters tjgg. error loading model: llama_model_loader: failed to load model from ./models/llama-2-7b-chat.bin`
I'm on a base M1 Pro; which version of llama_cpp_python should I install?
There was an internal server error when running
`run-mac.sh`
on Mac. To fix it, I updated llama_cpp_python to the latest version. This resolves #73 and #95.