olimorris / codecompanion.nvim

✨ AI-powered coding, seamlessly in Neovim. Supports Anthropic, Copilot, Gemini, Ollama, OpenAI and xAI LLMs
MIT License

[Bug]: Visual Selection + `:CodeCompanion <Prompt>` often replaces selection with a single line of concatenated text #426

Open GitMurf opened 1 day ago

GitMurf commented 1 day ago

Your minimal.lua config

N/A

Error messages

N/A

Log output

N/A

Health check output

N/A

Describe the bug

It seems that when you run `:CodeCompanion <some question>` with a visual selection, without using a prompt from the prompt library, the function that generates the code block from the selection is an outlier compared to all the other, similar functions in the code base: it concatenates the selected lines with a double space (`"  "`) instead of a `"\n"` line break.

This leads to very inconsistent AI responses. Occasionally the model is smart enough to try to parse the lines itself without any line breaks and returns a mediocre result split across lines, but most of the time I get back one giant string in which the model also separates its own lines with spaces instead of line breaks (likely because it is mirroring the format it received).

Here is the code in question:

https://github.com/olimorris/codecompanion.nvim/blob/f0bc99111edad40fcca6628c4cefb38c93a42ab5/lua/codecompanion/strategies/inline.lua#L39-L46
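To make the difference concrete, here is a minimal, self-contained sketch of the two concatenation behaviors. The `lines` table is a hypothetical visual selection, not taken from the plugin's code; only the choice of separator mirrors the linked snippets.

```lua
-- Hypothetical selection: three lines of code captured from a visual range.
local lines = {
  "local function greet(name)",
  '  print("hello " .. name)',
  "end",
}

-- What the inline strategy currently does (per the linked code): join with
-- spaces, collapsing the whole selection into one long line before it is
-- sent to the LLM.
local flattened = table.concat(lines, "  ")

-- What the rest of the code base does: join with newlines, preserving the
-- original line structure of the selection.
local preserved = table.concat(lines, "\n")

print(flattened) -- one long line
print(preserved) -- three lines, as originally selected
```

Since the model tends to mirror the formatting of its input, the flattened variant is what produces the single-line responses described above.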

As you can see, in the several other places where similar line concatenations are done, the code base uses `\n`, and the AI responses are then consistently well formatted, with lines split properly in the diff view. Here, for example, is what many of the default prompt library entries use:

https://github.com/olimorris/codecompanion.nvim/blob/f0bc99111edad40fcca6628c4cefb38c93a42ab5/lua/codecompanion/helpers/actions.lua#L5-L21

You will notice that the last line uses the same `table.concat()` of the selected lines but correctly joins them with `\n`.

Reproduce the bug

Explained in detail above. Here is an example of the issue, where 19 lines come back into the diff as one single line. To repro I simply:

  1. Visually select some code
  2. Run `:'<,'>CodeCompanion please update my selected code by removing code comments`
  3. The diff shows the changes: the 19 lines are replaced with 1 giant concatenated line, because the model is replicating the format it received

[screenshot]

And here is what it looked like when selecting it before code companion runs on it:

[screenshot]

Final checks

olimorris commented 1 day ago

I'd very much welcome a PR on this! Thanks for raising.

GitMurf commented 1 day ago

> I'd very much welcome a PR on this! Thanks for raising.

Will do! May not be until tomorrow but hopefully tonight.

e2r2fx commented 1 hour ago

This can also be worked around by adding a system prompt instructing the LLM to keep its response in the same formatting as the input. Whenever I do inline editing I add that by hand 😅 and it fixes any formatting errors in the response.
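The workaround above could be sketched roughly as follows. Note this is a hypothetical helper, not codecompanion's actual API; the function name and message text are illustrative, and where such an instruction would be wired into the plugin's config is an open question.

```lua
-- Hypothetical sketch: prepend a formatting instruction to a system prompt
-- before an inline request is sent, so the model preserves the input's
-- line structure in its reply.
local function with_format_instruction(system_prompt)
  local instruction = "Preserve the original line breaks and indentation "
    .. "of the selected code in your response."
  -- Append the instruction after any existing system prompt text.
  return (system_prompt or "") .. "\n\n" .. instruction
end

-- Example: augment a base system prompt with the formatting rule.
print(with_format_instruction("You are a helpful coding assistant."))
```

This treats the symptom rather than the cause, though; fixing the `table.concat()` separator addresses it at the source.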