David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
992 stars · 64 forks

Adds conversation support #31

Closed — kjjuno closed this 8 months ago

kjjuno commented 8 months ago

This PR makes a number of changes. Primarily, it adds support for conversational interaction with the AI model. To accomplish this, I modified the code to use the REST API. The current state of this PR will probably not work yet for the container workflow, but it should be possible to update that workflow to run the curl command inside the container.
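The REST-based flow can be sketched roughly like this (a minimal sketch assuming Ollama's streaming `/api/generate` endpoint; the function and callback names here are illustrative, not the PR's actual code):

```lua
-- Stream tokens from Ollama's REST API via curl inside a Neovim job.
-- `on_token` is a hypothetical callback that appends text to the chat buffer.
local function generate(model, prompt, on_token)
  local body = vim.fn.json_encode({ model = model, prompt = prompt })
  vim.fn.jobstart({
    "curl", "-s", "--no-buffer",
    "-d", body, "http://localhost:11434/api/generate",
  }, {
    on_stdout = function(_, lines)
      -- Ollama streams newline-delimited JSON; each line carries one
      -- "response" fragment until a final object with done = true.
      for _, line in ipairs(lines) do
        if line ~= "" then
          local chunk = vim.fn.json_decode(line)
          if chunk.response then on_token(chunk.response) end
        end
      end
    end,
  })
end
```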

This also switches the window from a floating window to a vertical split, which I find much easier to iterate with in a chat scenario.

I wanted to get this in front of you before I spend too much more time on this just to get your feedback on the direction you would like to see this take.

I feel like the changes here actually have the potential to close both of the following tickets:

https://github.com/David-Kunz/gen.nvim/issues/28
https://github.com/David-Kunz/gen.nvim/issues/30

David-Kunz commented 8 months ago

Hi @kjjuno ,

Thank you so much for this PR!

I tried to run it, but I get:

Vim:E474: Attempt to decode a blank string
stack traceback:
	[C]: in function 'json_decode'
	/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:149: in function </Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:139>

kjjuno commented 8 months ago

Interesting, I'll have to look into why. What model are you using?

kjjuno commented 8 months ago

Can you give me an example prompt that you used, and any context that you provided to the prompt?

kjjuno commented 8 months ago

I think I found your issue. I've added a lot more error checking, and it now works much more consistently on my machine. Would you mind testing it again?
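For reference, the kind of guard that avoids the E474 crash (a sketch, not necessarily the exact code in the PR) is to skip blank stream lines and wrap the decode in `pcall`:

```lua
-- Defensive decoding for one line of a streamed NDJSON response.
-- Blank keep-alive lines and malformed fragments are ignored instead
-- of letting json_decode raise "Attempt to decode a blank string".
local function decode_chunk(line)
  if line == nil or line == "" then
    return nil
  end
  local ok, chunk = pcall(vim.fn.json_decode, line)
  if not ok or type(chunk) ~= "table" then
    return nil
  end
  return chunk
end
```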

kjjuno commented 8 months ago

:-( I seem to have broken the prompts that do auto replacement. I will get that fixed.

kjjuno commented 8 months ago

That should be fixed now.

kjjuno commented 8 months ago

With regards to the docker container feature...

The ollama/ollama container does not include wget or curl, so this change to use the REST API doesn't work well in that setup. The alternative would be to remove the option of running ollama directly inside the container, and instead have the container bind to a port on the local machine and use the REST API against that port.

You would still be able to host ollama in Docker, but this plugin would no longer be "docker aware" and would instead just take an API URL. This would also have the added benefit of allowing you to target another machine, as requested in https://github.com/David-Kunz/gen.nvim/issues/30
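That setup might look something like this (the `api_url` option name is hypothetical, used only to illustrate the idea, not an existing gen.nvim option):

```lua
-- Hypothetical configuration if the plugin just takes a plain API URL.
-- Ollama can then run anywhere that exposes the port, e.g. in Docker:
--   docker run -d -p 11434:11434 ollama/ollama
-- or on another machine entirely, by swapping localhost for its address.
require('gen').setup({
  api_url = "http://localhost:11434/api/generate",
})
```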

What are your thoughts about going this route?

David-Kunz commented 8 months ago

Hi @kjjuno,

That worked, but I still get some issues after closing the buffer:

	[C]: in function 'nvim_buf_get_lines'
	/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:32: in function 'write_to_buffer'
	/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:227: in function </Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:198>

To reproduce:

1) Select a text to summarize
2) Close the buffer
3) Select a text to summarize

kjjuno commented 8 months ago

It looks like this is really caused by closing the buffer before the job has finished writing text, so the job tries to write to a nonexistent buffer. I've added an autocmd that should prevent that scenario.
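The guard can be sketched like this (variable names such as `result_buf` and `job_id` are illustrative, not the PR's actual identifiers):

```lua
-- Stop the running generation job when the result buffer goes away,
-- so on_stdout never calls nvim_buf_get_lines on a dead buffer.
vim.api.nvim_create_autocmd({ "BufDelete", "BufUnload" }, {
  buffer = result_buf,
  callback = function()
    if job_id then
      vim.fn.jobstop(job_id)
      job_id = nil
    end
  end,
})
```

Checking `vim.api.nvim_buf_is_valid(result_buf)` before each write would be a belt-and-suspenders complement to this, since jobstop does not cancel output that is already in flight.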

kjjuno commented 8 months ago

This was accidentally closed while attempting to resolve merge conflicts. It has been superseded by https://github.com/David-Kunz/gen.nvim/pull/36