gsuuon / model.nvim

Neovim plugin for interacting with LLMs and building editor-integrated prompts.
MIT License

Support for connection to llamacpp server #14

Closed by JoseConseco 9 months ago

JoseConseco commented 9 months ago

This PR allows connecting to a llamacpp server, and fixes https://github.com/gsuuon/llm.nvim/issues/13. This way llama does not have to be started for each user prompt. It includes the completions patch from the issue mentioned above, provided by gsuuon. I'm new to this, so feel free to fix any issues in this PR.
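A minimal sketch of what connecting to a running llamacpp server could look like from a model.nvim prompt definition. The `url` option, the `llamacpp` provider module path, and the prompt name are assumptions for illustration, not the plugin's confirmed API; `n_predict` is a llama.cpp server completion parameter.

```lua
-- Hedged sketch: a prompt backed by an already-running llama.cpp server,
-- so the model does not have to be loaded for each user prompt.
-- Module path and option names are illustrative assumptions.
local llamacpp = require('model.providers.llamacpp')

require('model').setup({
  prompts = {
    ['llamacpp:ask'] = {
      provider = llamacpp,
      options = {
        url = 'http://localhost:8080', -- address of the running server
      },
      params = {
        n_predict = 256, -- max tokens to generate (llama.cpp parameter)
      },
      builder = function(input)
        -- Send the selection or buffer content as the raw prompt
        return { prompt = input }
      end,
    },
  },
})
```

With a setup like this, the server is started once out-of-band and every prompt run only makes an HTTP request to it.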

JoseConseco commented 9 months ago

Ok, I also pushed some changes to the default query. By default, the user can select text and ask a question about it. The best default query format is not clear. In the end, from my tests I got the best results using:

```
<instr> Fix my code? -- user question or command
'''
Code block
'''
</instr>
```

Adding a block usually just gave me worse results... It may depend on the selected model, not sure. I do not have time to test it more. IMO it is best to leave the choice of prompt format to the user. That is why, in the last commit, I made a bunch of helper methods with prompt presets.
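One of the prompt presets described above could be sketched roughly as follows. The preset name, field names, and the `<instr>` tag format are illustrative assumptions; as noted, the tag format a model responds to best varies by model, which is why the choice is left to the user.

```lua
-- Hedged sketch of a preset that wraps the visual selection and the
-- user's question in instruction tags. Names here are hypothetical,
-- not the exact helpers added in the commit.
['llamacpp:instruct'] = {
  provider = llamacpp,
  builder = function(input, context)
    -- `input` is the selected text; `context.args` is the user's question
    local question = context.args or 'Fix my code?'
    return {
      prompt = '<instr> ' .. question
        .. "\n'''\n" .. input .. "\n'''\n</instr>",
    }
  end,
},
```

Swapping the template string is all it takes to adapt the preset to a different model's expected instruction format.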

JoseConseco commented 9 months ago

Demo of how this works currently (using my custom modify prompt):

https://github.com/gsuuon/llm.nvim/assets/13521338/2cdd5033-142c-4b50-b489-b9a172730f9b

gsuuon commented 9 months ago

Ok, let's get this merged. I'll replace the llamacpp provider and clean it up. Thanks for contributing!