David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense

Support for llamacpp #59

Closed JoseConseco closed 7 months ago

JoseConseco commented 8 months ago

With this PR you can get responses from a llamacpp server. Fixes https://github.com/David-Kunz/gen.nvim/issues/54
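
For context, a minimal sketch of what pointing the plugin at a local llamacpp server might look like. The `llamacpp` flag is an assumption for illustration, not necessarily the option this PR introduces; llama.cpp's built-in server listens on port 8080 by default:

```lua
-- Hypothetical configuration: the `llamacpp` option name is an
-- assumption for illustration, not confirmed as this PR's API.
require('gen').setup({
  llamacpp = true,     -- assumed flag: talk to a llamacpp server instead of ollama
  host = "localhost",  -- where the llama.cpp server is running
  port = "8080",       -- llama.cpp server's default port
})
```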

JoseConseco commented 8 months ago

Also, in the latest commit I added an option to specify model options, e.g. `model_options = { min_p = 0.1, temperature = 1.4 }`, both in the user's `setup()` and per user-defined command. There needs to be a better check of whether the user is actually using llamacpp or ollama, so that `model_options` is ignored when the llamacpp server is not in use... Not sure what the best way to set it all up is. A sketch of the two configuration levels follows below.
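
A minimal sketch of the two levels described above, assuming `model_options` is forwarded to the server as-is; the command name and prompt are hypothetical examples, not part of this PR:

```lua
-- Global defaults: applied to every generation request (per this comment).
require('gen').setup({
  model_options = { min_p = 0.1, temperature = 1.4 },
})

-- Per user-defined command: overrides the global model_options.
-- 'Elaborate' and its prompt are hypothetical examples.
require('gen').prompts['Elaborate'] = {
  prompt = "Elaborate the following text:\n$text",
  model_options = { temperature = 0.2 },  -- lower temperature for this command only
}
```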

David-Kunz commented 7 months ago

Hi @JoseConseco ,

Thanks a lot for this PR! I made some minor suggestions; if they're OK, we can merge!