huggingface / llm-ls

LSP server leveraging LLMs for code completion (and more?)
Apache License 2.0

Use as backend for chat-style UI #98

Open raine opened 1 month ago

raine commented 1 month ago

Could this also be used as a backend for a chat-style UI such as the one CopilotChat.nvim provides?

It allows you to interact with an LLM by asking questions about selected parts of code in the open buffer, and to easily apply suggestions as diffs, avoiding the manual copy-pasting between the chat and the editor that one might do when using ChatGPT in the browser. The issue with CopilotChat.nvim in particular, though, is that it's not backend-agnostic: it only supports GitHub's Copilot.

Maybe this could be something for llm.nvim as well.

Quick demo of how it looks:

https://github.com/huggingface/llm-ls/assets/11027/e3375a0a-261b-418a-b4b0-6138a860ac89

McPatate commented 1 month ago

This is a big feature I want to add to llm-*! I haven't had the time yet, though. It shouldn't be too hard to add to llm-ls; I'm still wondering what the config should look like. Should we use the same model for completion and chat, or a different one, etc.
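One way to frame the "same model or a different one" question is to give completion and chat separate config entries, with chat falling back to the completion model when unset. Sketched here as a plain Python dict; every key and model id below is hypothetical, not part of any existing llm-ls configuration:

```python
# Hypothetical config shape for chat support. None of these keys exist
# in llm-ls today; this only illustrates the design question above.
config = {
    "completion": {
        "model": "bigcode/starcoder",  # example model id (assumption)
        "max_new_tokens": 60,
    },
    "chat": {
        # If None, the server could fall back to the completion model.
        "model": None,
        "max_new_tokens": 512,
    },
}

def resolve_chat_model(cfg):
    """Pick the chat model, falling back to the completion model."""
    return cfg["chat"]["model"] or cfg["completion"]["model"]
```

With this shape, a user who sets only a completion model gets chat "for free" with the same backend, while power users can point chat at a larger instruction-tuned model.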

I think the bulk of the work is on the IDE client side.
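On the client side, the editor plugin would presumably send the server a custom LSP request carrying the selected code range and the user's question. A minimal sketch of what such a JSON-RPC message could look like; the `llm-ls/chat` method name and all of its params are made up for illustration, llm-ls defines no such request today:

```python
import json

# Hypothetical JSON-RPC request an editor plugin could send. The
# "llm-ls/chat" method and its params are assumptions, not part of
# the actual llm-ls protocol.
def make_chat_request(request_id, question, selection, uri):
    start_line, end_line = selection  # zero-based lines of the selection
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "llm-ls/chat",
        "params": {
            "textDocument": {"uri": uri},
            "range": {"start": start_line, "end": end_line},
            "question": question,
        },
    })

msg = make_chat_request(
    1, "What does this function do?", (10, 25), "file:///tmp/main.rs"
)
```

The server would then resolve the range against the open document, build a prompt from the selected code plus the question, and stream the answer back, leaving the diff-application UX entirely to the editor plugin.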