S1M0N38 closed this issue 3 months ago.
That seems super odd. `append_to_buf` shouldn't be called, as the classification is going straight to the chat strategy. Could you share all of the logs? I'd expect line 306 to write a log entry of where the LLM thinks your prompt should be placed.
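For anyone skimming the thread, the behaviour being discussed works roughly like this: the inline request is first classified, and a "chat" classification hands the prompt to the chat strategy instead of writing into the current buffer. A minimal sketch of that idea, where `classify`, `chat_strategy` and `append_to_buf` are stand-ins and not the plugin's actual API:

```lua
-- Illustrative sketch only; the real routing lives in codecompanion's
-- inline.lua and differs in detail.
local function handle_inline(prompt, classify, chat_strategy, append_to_buf)
  local placement = classify(prompt) -- the LLM decides where the output belongs
  if placement == "chat" then
    -- Routed straight to the chat strategy, so append_to_buf should never run
    -- for this classification, which is why the reported call is odd.
    return chat_strategy(prompt)
  end
  return append_to_buf(prompt, placement)
end
```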
btw 2 things that can be added in the issue template about sharing logs:
- logs are stored in `.repro/state/nvim/codecompanion.log` (not in `~/.config/nvim/state...`); the repro sketch below shows where that path comes from
- tell the user to remove secrets (e.g. `api_key` for copilot)
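For context on the first point: minimal repro configs of this kind usually redirect Neovim's XDG directories into a local `.repro` folder, which is what puts the log under `.repro/state/nvim/`. A sketch of that bootstrap, assuming the common lazy.nvim-style `minimal.lua` (the file in this issue may differ):

```lua
-- Typical repro bootstrap: isolate config/data/state/cache under ./.repro so
-- the session does not touch the user's real Neovim directories. With this in
-- place, codecompanion writes its log to .repro/state/nvim/codecompanion.log.
local root = vim.fn.fnamemodify("./.repro", ":p")
for _, name in ipairs({ "config", "data", "state", "cache" }) do
  vim.env[("XDG_%s_HOME"):format(name:upper())] = root .. name
end
```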
Mighty good spot, thank you
And to check, you're on the latest version of the plugin?
Yep, cloned from 1.8.1.
I've only changed the adapter from openai to copilot.
https://github.com/user-attachments/assets/2c7af0fa-3e2a-490e-894c-522f22a8ddf7
I can't recreate this at all with your `minimal.lua` config. I pushed some minor updates to `inline.lua` earlier but can't believe that's sorted it.
I think I may have found the reason for this issue.
I was running `nvim --clean -u minimal.lua` inside the cloned repo `olimorris/codecompanion.nvim`. It's possible that when the CodeCompanion plugin was looking for some files, it first looked inside the cloned repo (my current working directory) and not in the location of the plugin installed with lazy (as shown in the first part of the video).
There might be some global variables or states that are set in the plugin installed with lazy but not in the cloned version.
I've tried to reproduce the bug by running `nvim --clean ...` from another location, and I wasn't able to: everything works fine. I apologize for wasting your time on this non-existent bug.
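If anyone hits a similar "works from one directory but not another" situation, one way to confirm which copy of the plugin was actually picked up is to ask Neovim where it resolved the module from. `nvim_get_runtime_file` is a standard Neovim API; the paths in the comments are assumptions about a lazy install under `.repro`:

```lua
-- Run inside the `nvim --clean -u minimal.lua` session. If the printed path
-- points into the cloned repo (the current working directory) rather than
-- into the lazy install (e.g. .repro/data/nvim/lazy/codecompanion.nvim/...),
-- the clone is shadowing the installed plugin.
local resolved = vim.api.nvim_get_runtime_file("lua/codecompanion/init.lua", false)
print(resolved[1])
```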
I know this is kind of an edge case, but should users be warned in the issue template not to run `nvim --clean ...` inside the `codecompanion.nvim` directory?
That's a very good point as well. I'll amend the bug report template.
Your `minimal.lua` config
Error messages
Health check output
Log output
Describe the bug
The first chat buffer is unlocked (it works). However, when creating a new chat (via a new inline command), the chat buffer becomes locked and is only updated when the LLM request finishes.
Reproduce the bug
https://github.com/user-attachments/assets/ff58c9f2-bc48-4ea8-95f1-183845922e89
Final checks
- I have tested with the `minimal.lua` file from above and have shared this