Closed kjjuno closed 8 months ago
Hi @kjjuno ,
That's an excellent suggestion and I also thought about this.
Maybe one could add a custom command
`:GenPrompt <some prompt>`
which sends another prompt and appends the result to the current buffer.
However, that would change the ollama
invocation; I need to think about this.
Will be fixed by https://github.com/David-Kunz/gen.nvim/pull/36
I would like to be able to have an iterative conversation with the AI. The web UIs for ollama already support this.
Example:
user: Write a lambda in typescript that returns "Hello World"
AI: <lambda code>
user: Can you modify the lambda to get the response message from the `process_body` method?
AI: <modified lambda code>
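To illustrate, the two turns above might produce something like the following sketch. The `process_body` method and its return shape are assumptions for the example; the original conversation only names the method:

```typescript
// Turn 1: a lambda (arrow function) that returns "Hello World".
const hello = (): string => "Hello World";

// Turn 2: the modified lambda, pulling the message from a hypothetical
// process_body method instead of hardcoding it.
interface BodyProcessor {
  process_body(): { message: string };
}

const helloFromBody = (processor: BodyProcessor): string =>
  processor.process_body().message;

// Example usage with a stub processor standing in for the real object.
const stub: BodyProcessor = {
  process_body: () => ({ message: "Hello World" }),
};

console.log(hello());             // "Hello World"
console.log(helloFromBody(stub)); // "Hello World"
```

The point of the iterative flow is that the second prompt refines the first result in place, rather than starting a fresh, context-free generation.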