Closed: sam1am closed this issue 1 year ago
I tried this in Windows as well and got the same result.
Thank you for reporting. I am facing the same issue; I'm on it :)
The problem is that the model's URL changed. In the meantime, you can install the model directly from a link:
catai install https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/main/ggml-vic13b-q4_0.bin
I tried that install line, and got this error when serving:
I know there is a bug with installing by version. I will look into that.
Thank you. I really appreciate the work you're putting in here. This sort of library helps simplify the crazy world of open source LLMs.
The problem is a new GGML format that is not yet supported by llama.cpp. For now, you can try the old format:
https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/819b698c54b906ca7f3e1774cf748c2018932b07/ggml-vic13b-uncensored-q4_0.bin
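If you're unsure whether a file you already downloaded is the new or the old format, you can read its magic bytes. Here is a minimal sketch; the magic values and the versioned/unversioned split are assumptions based on llama.cpp's loader at the time (`ggml` unversioned, `ggmf`/`ggjt` followed by a version number, with `ggjt` v3 being the new format this issue hits):

```python
import struct

# Assumed GGML-family container magics (stored as little-endian uint32):
#   0x67676d6c "ggml" - original, unversioned container
#   0x67676d66 "ggmf" - versioned container
#   0x67676a74 "ggjt" - versioned container (v3 = the new format)
MAGICS = {0x67676d6c: "ggml", 0x67676d66: "ggmf", 0x67676a74: "ggjt"}

def ggml_format(path):
    """Return (container_name, version) for a GGML-family model file."""
    with open(path, "rb") as f:
        magic = struct.unpack("<I", f.read(4))[0]
        name = MAGICS.get(magic)
        if name is None:
            return ("unknown", None)
        if name == "ggml":
            # The original container has no version field after the magic.
            return (name, None)
        # Versioned containers store a uint32 version right after the magic.
        version = struct.unpack("<I", f.read(4))[0]
        return (name, version)
```

For example, `ggml_format("ggml-vic13b-q4_0.bin")` returning `("ggjt", 3)` would mean the file is the new format that llama.cpp (as bundled here) cannot load yet.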
The model links are now updated, so this issue should be fixed, at least until llama.cpp can handle GGML v3.
Works for me! Thank you sir.
I tried this on the 22nd and was able to install models, but couldn't get it to serve (it complains the model is not found).
With the latest version it doesn't appear to be installing models anymore.
When I run install, I just see a cd command echoed to the terminal and nothing else. Same thing if I try to run it from that directory.