Closed crypticpi closed 1 year ago
This new model format will be supported in the next version of CatAI. In the meantime, you can install the old format:
catai install https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/819b698c54b906ca7f3e1774cf748c2018932b07/ggml-old-vic13b-uncensored-q4_2.bin
Thanks for pointing this out. We will have a better solution for format installation in the future.
I get the same error on that model as well
Seems that the link is broken, this one should work:
catai install https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/819b698c54b906ca7f3e1774cf748c2018932b07/ggml-vic13b-uncensored-q4_0.bin
The model links have now been updated, so this issue should be resolved, at least until llama.cpp can handle ggml v3.
Describe the bug
I get this error when trying to use the vicuna 13b uncensored model.
It works great with the 7B model.