Closed SuperBo closed 1 week ago
Never mind, I found the `file_path` option. I think this should be documented in the wiki.
```lua
{
  memory = {
    file_store = {},
  },
  models = {
    model1 = {
      type = "llama_cpp",
      repository = "Qwen/CodeQwen1.5-7B-Chat-GGUF",
      name = "codeqwen-1_5-7b-chat-q4_k_m.gguf",
      file_path = "/Users/user1/model/codeqwen-1_5-7b-chat-q4_k_m.gguf",
      n_ctx = 2048,
      n_gpu_layers = 1000,
    },
  },
}
```
Glad you were able to find it! It is documented under https://github.com/SilasMarvin/lsp-ai/wiki/Configuration#llamacpp, but that configuration page is getting so large that it may be worth redoing our docs as a small website at some point soon. I should also note that if you set `file_path`, you don't have to set `repository` and `name`.
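Based on that note, a minimal sketch of a local-model configuration that relies on `file_path` alone (same Lua-style config as above; the path is illustrative and `repository`/`name` are omitted since the file is already on disk):

```lua
{
  memory = {
    file_store = {},
  },
  models = {
    model1 = {
      type = "llama_cpp",
      -- With file_path set, repository and name are unnecessary:
      -- LSP-AI loads the model directly from this local file.
      file_path = "/Users/user1/model/codeqwen-1_5-7b-chat-q4_k_m.gguf",
      n_ctx = 2048,
      n_gpu_layers = 1000,
    },
  },
}
```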
For now, LSP-AI tries to download the model file when configured with `llama_cpp`. Is there any way for a user to point it at an already-downloaded model in a local folder?