David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
992 stars · 64 forks

feature: Enable Context #36

Closed kjjuno closed 8 months ago

kjjuno commented 8 months ago

This replaces https://github.com/David-Kunz/gen.nvim/pull/31 as that PR was accidentally closed during merge conflict resolution.

David-Kunz commented 8 months ago

Thanks, @kjjuno !

I think we should merge that, but I want to do some additional testing before.

David-Kunz commented 8 months ago

I get the error

^I[C]: in function 'nvim_buf_get_lines'
^I/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:32: in function 'write_to_buffer'
^I/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:245: in function </Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:216>

when closing the split and running :Gen with Change Code.

kjjuno commented 8 months ago

What are your thoughts about the container? I can't run curl against the container, so I think we would have to rely on users opening a port to the container and just using the REST API.

kjjuno commented 8 months ago

> I get the error
>
>     ^I[C]: in function 'nvim_buf_get_lines'
>     ^I/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:32: in function 'write_to_buffer'
>     ^I/Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:245: in function </Users/d065023/projects/nvim/gen.nvim/lua/gen/init.lua:216>
>
> when closing the split and running :Gen with Change Code.

I’m guessing this is similar to the last bug you reported. I’ll see if I can fix it later today.

David-Kunz commented 8 months ago

I also prefer not to show the prompt; especially for longer prompts, it is quite distracting. Users are usually only interested in the result.

In the future:

  • we might also allow follow-up questions
  • we might provide actions to replace the previously highlighted text on demand (and remove the replace = true option)

kjjuno commented 8 months ago

The invalid buffer bug should be fixed now.

kjjuno commented 8 months ago

> I also prefer not to show the prompt; especially for longer prompts, it is quite distracting. Users are usually only interested in the result.
>
> In the future:
>
>   • we might also allow follow-up questions
>   • we might provide actions to replace the previously highlighted text on demand (and remove the replace = true option)

I personally like the prompt history, especially in a longer iterative chat. However, your point about longer prompts makes a lot of sense. How would you feel about showing only the first two lines or so? I could even wrap that in a feature flag, so displaying the prompt would be opt-in and the default behavior would be to hide it.

I'll play around with that idea and push something up.
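A minimal sketch of the "first two lines" idea discussed above (the helper name and the option are my assumptions based on this thread, not the plugin's actual API):

```lua
-- Hypothetical helper: when the (assumed) show_prompt option is enabled,
-- only the first two lines of the prompt would be echoed into the output
-- buffer, with an ellipsis hinting that the rest was elided.
local function prompt_preview(prompt)
  local lines = vim.split(prompt, "\n")
  if #lines <= 2 then
    return lines
  end
  return { lines[1], lines[2], "..." }
end
```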

I also have a local version of https://github.com/David-Kunz/gen.nvim/pull/22 that uses curl instead of plenary. I enjoy using it and would be happy to add it to this PR if you're interested. If not, I'll just use it locally until https://github.com/David-Kunz/gen.nvim/pull/22 gets merged.

kjjuno commented 8 months ago

I'm thinking of making it look something like this:

    require("gen").setup({
      model = "codellama:34b",
      show_prompt = true,
    })

    :Gen Summarize

David-Kunz commented 8 months ago

Thank you, @kjjuno. I still find it too much visual clutter; I would also remove the headline # Chat with <model>, since that's something the user already knows. Hence I propose to remove everything except the generated text, but since you'd like to show the prompt, maybe we can make it optional?

    require('gen.nvim').show_prompt = true

And I would show the separator only if a new result is appended.

Otherwise, the PR is good and seems to work on my machine 👍

What do you think?

Thank you and best regards, David

leafo commented 8 months ago

Just wanted to add a note as a daily user of this plugin: I think the change to a split is subjective and something I personally would not like. I code with splits, and I feel like plugins that insert a new split often do so under the assumption that someone is looking at a single file. When you already have splits on the screen and a plugin inserts a new one, the shift in layout can be jarring and annoying. (I'm trying to ask the LLM a question, not re-arrange my editor!) I think a floating window is a perfect fit for this plugin, as the result of the prompt is a temporary action you're taking on the code you're editing.

Edit: Also regarding showing the prompt, I agree it's visual clutter. I tend to have very big prompts for most of my queries as I'm sending in the majority of the text on my screen with my average prompt. If displaying the prompt or part of it is added, then I vote for having the ability to turn off displaying it in the output window.

kjjuno commented 8 months ago

> Just wanted to add a note as a daily user of this plugin: I think the change to a split is subjective and something I personally would not like. I code with splits, and I feel like plugins that insert a new split often do so under the assumption that someone is looking at a single file. When you already have splits on the screen and a plugin inserts a new one, the shift in layout can be jarring and annoying. (I'm trying to ask the LLM a question, not re-arrange my editor!) I think a floating window is a perfect fit for this plugin, as the result of the prompt is a temporary action you're taking on the code you're editing.
>
> Edit: Also regarding showing the prompt, I agree it's visual clutter. I tend to have very big prompts for most of my queries, as I'm sending in the majority of the text on my screen with my average prompt. If displaying the prompt or part of it is added, then I vote for having the ability to turn off displaying it in the output window.

I would be happy to make the change to splits an opt in feature to maintain what current users are used to.

kjjuno commented 8 months ago

@leafo I should also note that the display of the prompt is off by default and is something you would have to opt into.

kjjuno commented 8 months ago

I pushed an update that makes the default behavior (visually) exactly the same as before.

However, there are now options to enable displaying the model and/or prompt. You can also opt into splits instead of the floating window.
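Sketched concretely, opting in might look something like this (the option names here are my guesses based on this thread's discussion, not confirmed plugin API):

```lua
-- Hypothetical opt-in configuration; every option below is assumed to
-- default to the old behavior (floating window, no model/prompt shown).
require("gen").setup({
  model = "codellama:34b",
  show_model = true,      -- print which model produced the output
  show_prompt = true,     -- echo the prompt above the response
  display_mode = "split", -- use a split instead of the floating window
})
```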

kjjuno commented 8 months ago

I still think the Docker container feature needs to be considered before merging. As it stands, this PR breaks the direct integration with a container: users would have to maintain their Docker container manually, and it adds the requirement that they bind a port.

How would you like to see this handled?
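For reference, a sketch of what the REST-over-port approach could look like (this is illustrative, not the PR's implementation; 11434 is Ollama's default port and /api/generate is its generate endpoint, and the container would have to be started with something like `docker run -p 11434:11434 ...`):

```lua
-- Hypothetical: build a curl command against a port-bound container
-- instead of exec'ing into it.
local function build_curl_command(model, prompt)
  local url = "http://localhost:11434/api/generate"
  local body = vim.json.encode({ model = model, prompt = prompt })
  return "curl --silent --no-buffer -X POST " .. url
    .. " -d " .. vim.fn.shellescape(body)
end
```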

David-Kunz commented 8 months ago

Thank you @leafo for your comments, and thank you @kjjuno for taking those into account. I tend to agree.

Pro vertical split:

Pro float:

I tend to prefer floats.

@kjjuno I tried your branch, but I don't get any responses. We should also enable

    pcall(io.popen, "ollama serve > /dev/null 2>&1 &")

if no container is used.

We previously had the M.container option; was there a reason you removed it? I think we should restore the former container behaviour (allow setting M.container and change the default command to run it).
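A sketch of what that restored behaviour could look like (M.container and the docker command line are illustrative assumptions, not the actual implementation):

```lua
-- Start the backend on demand: a local `ollama serve` when no container
-- is configured, otherwise (re)start the named Docker container.
local function ensure_backend()
  if M.container == nil then
    pcall(io.popen, "ollama serve > /dev/null 2>&1 &")
  else
    pcall(io.popen, "docker start " .. M.container .. " > /dev/null 2>&1 &")
  end
end
```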

What do you think?

David-Kunz commented 8 months ago

> I tried your branch, but I don't get any responses. We should also enable

@kjjuno now it seems to work, not sure what happened...

David-Kunz commented 8 months ago

Thanks a ton, @kjjuno, highly appreciated!

kjjuno commented 8 months ago

> Thanks a ton, @kjjuno, highly appreciated!

Glad to help! Thanks for accepting the contribution. It was a really fun project to work on.