leona / helix-gpt

Code assistant language server for Helix with support for Copilot/OpenAI/Codeium/Ollama
MIT License
285 stars · 19 forks

[Feature request] Add copilot chat as a command and append the results to a buffer #35

Open dc740 opened 4 months ago

dc740 commented 4 months ago

Is your feature request related to a problem? Please describe. I'm still configuring this, but I saw no reference to being able to use Copilot Chat.

Describe the solution you'd like I'd like to run a command in Helix (even if it's an SH command) to ask a question in a Copilot chat session, and see the response in a buffer dedicated to this conversation (so I can move it around Helix, split windows, etc). This buffer would contain the question I entered as a command and the response, and each new question+response pair would be appended to the buffer, separated by some delimiter and new lines.
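As a rough sketch of the append-to-buffer workflow (everything here is illustrative: `get_answer` is a stub, since helix-gpt does not currently expose a chat endpoint, and the log path and delimiter are just placeholders):

```shell
# Sketch: append a question and a (stubbed) answer to a chat log file
# that can be opened as a Helix buffer and reloaded after each exchange.

CHAT_LOG="${CHAT_LOG:-/tmp/copilot-chat.md}"

# Placeholder for the real chat call; an actual implementation would
# call the Copilot chat API here (e.g. via curl).
get_answer() {
    echo "(response to: $1)"
}

# Append one question/answer pair to the log, separated by a rule.
ask() {
    {
        printf '>>> %s\n\n' "$1"
        get_answer "$1"
        printf '\n---\n\n'
    } >> "$CHAT_LOG"
}
```

From inside Helix one could then run something like `:sh ask "my question"` (with the script sourced or wrapped as an executable) and open `/tmp/copilot-chat.md` in a split, reloading it after each exchange.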

Describe alternatives you've considered Having a dedicated interactive window does not seem to be supported in Helix? That would be nicer. Getting the chat to suggest code would be OK, but having it in a buffer is more than enough.

Additional context Feature in other IDEs: https://docs.github.com/en/copilot/github-copilot-chat/using-github-copilot-chat-in-your-ide

Thanks!!