David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts

Including the contents of the open buffers in the context when $buffers is in the prompt #48

kjjuno opened this issue 7 months ago

kjjuno commented 7 months ago

This PR allows you to specify $buffers in the chat prompt, which loads the contents of each open buffer into the prompt's context.
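
For illustration, a minimal sketch of how such a $buffers substitution could work (a hedged example, not necessarily the PR's actual code):

-- Sketch only: substitute $buffers in a prompt with the contents of
-- every listed, loaded buffer.
local function expand_buffers(prompt)
  if not prompt:find("%$buffers") then
    return prompt
  end
  local chunks = {}
  for _, buf in ipairs(vim.api.nvim_list_bufs()) do
    if vim.api.nvim_buf_is_loaded(buf) and vim.bo[buf].buflisted then
      local name = vim.api.nvim_buf_get_name(buf)
      local lines = vim.api.nvim_buf_get_lines(buf, 0, -1, false)
      table.insert(chunks, name .. ":\n" .. table.concat(lines, "\n"))
    end
  end
  -- function replacement avoids gsub treating '%' in buffer text specially
  return (prompt:gsub("%$buffers", function()
    return table.concat(chunks, "\n\n")
  end))
end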

kjjuno commented 7 months ago

Sorry for the arbitrary reformatting; this is what I get whenever I save with stylua enabled. I can change that if you want, but it might be a good idea to have a standard stylua config for this repo.

David-Kunz commented 7 months ago

Thank you, @kjjuno!

I'm not sure it wouldn't be better to take only the current buffer into account. In general, there are many open buffers, and local LLMs don't have a big context window.

Maybe for now, you could choose a similar approach as in https://github.com/David-Kunz/gen.nvim/issues/50 and do it programmatically outside of gen.nvim?

Thanks again and best regards, David

kjjuno commented 7 months ago

@David-Kunz That makes sense to me. It seems to depend quite a bit on which model you choose and the hardware you have powering ollama. I have an M2 Max with 32 GB of RAM and I've been running codellama:34b. That configuration seems to accept multiple files of context pretty well, though I certainly haven't done a lot of testing to see exactly where the limit is on the size of the context.

I would be happy to follow something like https://github.com/David-Kunz/gen.nvim/issues/50 to handle multiple buffers of context. But I like the suggestion of including the current buffer for the Ask prompt. Perhaps something like this?

This is the contents of `/path/to/file.ts`

---
{current buffer contents here}
---

Regarding the following selected text within that file

---
{selected text}
---

{user prompt here}
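
A rough sketch of how such a prompt could be assembled with the nvim API (the function name and the linewise selection handling here are illustrative, not gen.nvim's internals):

local function build_ask_prompt(user_prompt)
  local path = vim.api.nvim_buf_get_name(0)
  local buffer = table.concat(vim.api.nvim_buf_get_lines(0, 0, -1, false), "\n")
  -- last visual selection, via the '< and '> marks (linewise approximation)
  local first = vim.fn.getpos("'<")[2]
  local last = vim.fn.getpos("'>")[2]
  local selection = table.concat(vim.api.nvim_buf_get_lines(0, first - 1, last, false), "\n")
  return ("This is the contents of `%s`\n\n---\n%s\n---\n\n"
    .. "Regarding the following selected text within that file\n\n---\n%s\n---\n\n%s")
    :format(path, buffer, selection, user_prompt)
end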

David-Kunz commented 7 months ago

Thank you, @kjjuno. I think we shouldn't touch the Ask prompt, otherwise it would be a breaking change. Also, it would be inconsistent with all the other prompts, which always take the selected text.

RingOfStorms commented 6 months ago

You can work around a lot of this by dynamically building a prompt in Lua code and registering it as a temporary prompt. I created a function that uses $register as if it were the current buffer, since I was having issues yanking the current file before running the Gen command.

In this vein, you could iterate over all loaded buffers, or do whatever you want, really, to build your prompt (see the sketch after the link below).

This is done simply like so:

local function custom_thing()
  local g = require('gen')
  -- register a throwaway prompt; any string works, e.g. built from the
  -- current buffer's content via the nvim API
  g.prompts["tmp"] = { prompt = "Any string here; you can use the nvim API to get the current buffer's content, for example." }
  vim.cmd("Gen tmp")
  g.prompts["tmp"] = nil
end

Note: you may want to strip the Gen plugin's keywords, such as $input, out of your file contents, or it will prompt you when you don't expect it.

https://github.com/RingOfStorms/nvim/blob/f57d401b585df48440ab498511cbce80f86dff66/lua/plugins/gen-ollama.lua#L74-L81
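
Extending that idea, a sketch that folds every listed buffer into a one-off prompt (the prompt name is arbitrary, and the keyword stripping is a crude guard, not gen.nvim's escaping mechanism):

local function gen_with_buffers()
  local gen = require('gen')
  local chunks = {}
  for _, buf in ipairs(vim.api.nvim_list_bufs()) do
    if vim.api.nvim_buf_is_loaded(buf) and vim.bo[buf].buflisted then
      local text = table.concat(vim.api.nvim_buf_get_lines(buf, 0, -1, false), "\n")
      -- crude guard: drop $-placeholders so Gen doesn't try to expand them
      text = text:gsub("%$%a+", "")
      table.insert(chunks, vim.api.nvim_buf_get_name(buf) .. ":\n" .. text)
    end
  end
  -- one deliberate $input at the end so Gen still asks for the user's question
  gen.prompts["tmp_buffers"] = { prompt = table.concat(chunks, "\n\n") .. "\n\n$input" }
  vim.cmd("Gen tmp_buffers")
  gen.prompts["tmp_buffers"] = nil
end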

Pandoks commented 4 months ago

Would love this feature (anything that will increase the context for the LLM). Maybe individual files in the future too!

David-Kunz commented 4 months ago

Hi,

Yes, it's possible to dynamically set the prompts; I think for now that should be the way to go.

I'm a bit hesitant about putting the context in the prompt itself, as it should usually be decoupled from the prompt and defined outside. For example:

Bad:

prompt1 = "Simplify this text: $buffer"
prompt2 = "Simplify this text: $input"
prompt3 = "Simplify this text: $text"
-- ...

Better:

prompt = "Simplify this text: $context"
invoke(prompt, "buffer")
invoke(prompt, "input")
invoke(prompt, "text")
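
A sketch of what that decoupling could look like (invoke and the resolver table are illustrative, not an existing gen.nvim API):

local resolvers = {
  buffer = function()
    return table.concat(vim.api.nvim_buf_get_lines(0, 0, -1, false), "\n")
  end,
  input = function()
    return vim.fn.input("Context: ")
  end,
}

-- substitute $context from the chosen source before sending the prompt
local function invoke(prompt, source)
  local context = resolvers[source]()
  return (prompt:gsub("%$context", function() return context end))
end

-- invoke("Simplify this text: $context", "buffer")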

I have to think more about this.