David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
992 stars 64 forks source link

feature request - Add conversation Support #28

Closed kjjuno closed 8 months ago

kjjuno commented 8 months ago

I would like to be able to have an iterative conversation with the AI. This is supported with the web UIs for ollama already.

Example:

user: Write a lambda in typescript that returns "Hello World"

AI: <lambda code>

user: Can you modify the lambda to get the response message from the process_body method?

AI: <modified lambda code>

David-Kunz commented 8 months ago

Hi @kjjuno ,

That's an excellent suggestion; I've also thought about this.

Maybe one could add a custom command

:GenPrompt <some prompt>

which sends another prompt and appends the result to the current buffer.

However, that would change the ollama invocation; I need to think about this.
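For reference, one way to thread state between turns is the `context` field that ollama's `/api/generate` endpoint returns: passing it back with the next request lets the model continue the conversation. A minimal sketch of the request bodies involved (function name and token values are illustrative, not gen.nvim's actual code):

```python
def build_request(model, prompt, context=None):
    """Build a JSON body for an ollama /api/generate call.

    Passing the previous response's 'context' token array continues
    the conversation instead of starting a fresh one.
    """
    body = {"model": model, "prompt": prompt, "stream": False}
    if context is not None:
        body["context"] = context
    return body

# First turn: no prior context.
first = build_request(
    "mistral",
    'Write a lambda in typescript that returns "Hello World"',
)

# The server's response would carry back a context token array;
# placeholder values here, real ones come from the response JSON.
reply_context = [101, 7, 42]

# Second turn reuses it, so the model sees the earlier exchange.
second = build_request(
    "mistral",
    "Can you modify the lambda to get the response message "
    "from the process_body method?",
    context=reply_context,
)
```

A `:GenPrompt` command could keep the most recent `context` in a buffer-local variable and attach it to each follow-up request.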

kjjuno commented 8 months ago

Will be fixed by https://github.com/David-Kunz/gen.nvim/pull/36