Open ralyodio opened 1 year ago
try using the llama.cpp backend, i think it supports more model types than llm-rs
q5_1 may be supported later; I have not upgraded the llm-rs backend for it yet.
my models work fine with llm-rs
@hlhr202 what does q5_1 mean?
try using the llama.cpp backend, i think it supports more model types than llm-rs
how do i do this?
It is a quantization type for ggml models (q5_1 is a 5-bit quantization format); you can check it on the llama.cpp GitHub.
try using the llama.cpp backend, i think it supports more model types than llm-rs
how do i do this?
Check it here: https://llama-node.vercel.app/docs/backends/ and here: https://llama-node.vercel.app/docs/backends/llama.cpp/inference
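Following the docs linked above, switching to the llama.cpp backend is roughly a sketch like the one below. The import path and the config fields follow the llama-node inference docs; the model filename is a placeholder, and the exact options available may differ between llama-node versions, so treat this as an assumption-laden outline rather than a verified snippet:

```javascript
// Sketch: loading a ggml model through llama-node's llama.cpp backend.
// Model filename is hypothetical; point modelPath at your own .bin file.
import { LLM } from "llama-node";
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";
import path from "path";

const model = path.resolve(process.cwd(), "./ggml-vic7b-q5_1.bin");

const llama = new LLM(LLamaCpp);

await llama.load({
    modelPath: model,     // path to the ggml model file
    enableLogging: true,  // print backend logs, useful when a load fails
    nCtx: 1024,           // context window size
    seed: 0,
    f16Kv: false,
    logitsAll: false,
    vocabOnly: false,
    useMlock: false,
    embedding: false,
    useMmap: true,
});
```

Since llama.cpp tracks new ggml quantization formats more closely, a q5_1 file that fails on the llm-rs backend may load here.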
I think this issue also needs investigation of llama.cpp's LoRA support, but I am still reading the llama.cpp implementation. I will probably bring this feature soon.
Getting this error when trying to load a model:

[Error: Could not load model] { code: 'GenericFailure' }

I've modified the example a bit to take an argument as --model
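The modified example is not shown, but the --model handling could look like the sketch below. The function name and script name are illustrative, not from the repo; the point is that validating the path up front gives a clearer message than the opaque GenericFailure, which can also be raised for an unsupported quantization format rather than a bad path:

```javascript
// Hedged sketch of reading a --model flag in a Node example script.
import fs from "fs";

function parseModelPath(argv) {
    const i = argv.indexOf("--model");
    if (i === -1 || i + 1 >= argv.length) {
        throw new Error("usage: node inference.mjs --model <path-to-ggml-model>");
    }
    return argv[i + 1];
}

// In the real script this would be parseModelPath(process.argv.slice(2)).
const modelPath = parseModelPath(["--model", "./ggml-vic7b-q5_1.bin"]);

// Fail early with a readable error instead of GenericFailure from the backend.
if (!fs.existsSync(modelPath)) {
    console.error(`model file not found: ${modelPath}`);
}
```

If the file exists and the error persists, the model's quantization type (e.g. q5_1) may simply not be supported by the chosen backend.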