David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense

Bring back `generate` support endpoint #97

Open Altiire opened 1 month ago

Altiire commented 1 month ago

As stated in https://github.com/David-Kunz/gen.nvim/pull/68, the chat endpoint is now used and the generate endpoint is no longer usable.

IMO, supporting both (or even multiple) endpoints would be preferable.

David-Kunz commented 1 month ago

Hi @Altiire ,

What advantage does the generate endpoint have?

Thanks and best regards, David

Altiire commented 1 month ago

Hello @David-Kunz ,

The generate endpoint allows setting the system prompt per request (see the options here: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion)

With only the chat endpoint currently supported, I have had to create many custom models just to edit the system prompt... :/
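For context, the Ollama docs linked above show that `/api/generate` accepts a top-level `system` field that overrides the system prompt from the Modelfile. A minimal request body might look like this (model name and prompt text are placeholders):

```json
{
  "model": "mistral",
  "prompt": "Summarize the following text: ...",
  "system": "You are a concise technical assistant.",
  "stream": false
}
```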

Regards,

David-Kunz commented 1 week ago

Hi @Altiire ,

Would it also work for you to loop over all your prompts (require('gen').prompts) and prepend the system message?
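A rough sketch of that workaround in `init.lua`, assuming each entry in `require('gen').prompts` carries its instruction in a string-valued `prompt` field (the `system_preamble` variable is hypothetical):

```lua
-- Workaround sketch: prepend a shared system-style preamble to every
-- gen.nvim prompt instead of baking it into custom Ollama models.
local gen = require('gen')

local system_preamble = "You are a concise technical assistant."

for _, p in pairs(gen.prompts) do
  -- Only touch prompts whose instruction is a plain string.
  if type(p.prompt) == "string" then
    p.prompt = system_preamble .. "\n\n" .. p.prompt
  end
end
```

This keeps a single place to edit the "system" text, at the cost of it being part of the user prompt rather than a true system message.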