s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0
348 stars 25 forks

[FR] Ghost completions à la Copilot #107

Open NightMachinery opened 2 months ago

NightMachinery commented 2 months ago

ellama-code-complete inserts the LLM output directly into the buffer, which is not ideal. Copilot's "ghost completions", which overlay a short gray suggestion that you can confirm later, are more useful. https://github.com/zerolfx/copilot.el does this, but it only supports GitHub Copilot.
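For illustration, this kind of ghost completion can be built on a standard Emacs overlay. The sketch below is hypothetical (the `my/ghost-*` names are not part of ellama or copilot.el); it shows the mechanism: an overlay with an `after-string` propertized with the `shadow` face, which displays gray text at point without modifying the buffer until the user accepts it.

```elisp
;; Minimal sketch of a Copilot-style ghost completion using an overlay.
;; All `my/ghost-*' names are hypothetical, not part of ellama.
(defvar-local my/ghost-overlay nil
  "Overlay currently displaying a ghost completion, if any.")

(defun my/ghost-show (text)
  "Display TEXT at point as gray \"ghost\" text without inserting it."
  (my/ghost-clear)
  (setq my/ghost-overlay (make-overlay (point) (point)))
  (overlay-put my/ghost-overlay 'after-string
               (propertize text 'face 'shadow)))

(defun my/ghost-accept ()
  "Insert the ghost text for real, then remove the overlay."
  (interactive)
  (when my/ghost-overlay
    (insert (substring-no-properties
             (overlay-get my/ghost-overlay 'after-string)))
    (my/ghost-clear)))

(defun my/ghost-clear ()
  "Remove the ghost overlay without inserting anything."
  (interactive)
  (when my/ghost-overlay
    (delete-overlay my/ghost-overlay)
    (setq my/ghost-overlay nil)))
```

A real implementation would also need to clear the overlay on any buffer change or cursor movement (e.g. via `post-command-hook`), which is omitted here for brevity.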

Also, there should be a way to limit the completions to at most N lines and M characters.
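Such a limit could be a simple post-processing step on the model output before it is shown. A sketch, assuming the function name is hypothetical and the limits are applied lines-first, then characters:

```elisp
(require 'seq)
(require 'subr-x)  ; for `string-join' on older Emacs versions

(defun my/truncate-completion (text max-lines max-chars)
  "Truncate TEXT to at most MAX-LINES lines and MAX-CHARS characters.
Hypothetical helper; not part of ellama."
  (let ((trimmed (string-join
                  (seq-take (split-string text "\n") max-lines)
                  "\n")))
    (if (> (length trimmed) max-chars)
        (substring trimmed 0 max-chars)
      trimmed)))
```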

s-kostyaev commented 2 months ago

Agreed. It should also manage context properly. For now, try one of the Tabby integrations, like this one. This feature is in the very-long-term plan. Or maybe [help wanted]

NightMachinery commented 2 months ago

@s-kostyaev Thanks. What models does Tabby support? Skimming its docs, it seems to be a proprietary third-party LLM reseller?

s-kostyaev commented 2 months ago

I prefer to use DeepSeek Coder 6.7B with Tabby - it performs very well for my use cases.

Skimming its docs, it seems to be a proprietary third-party LLM reseller?

No. It pulls models from Hugging Face and runs them locally.

s-kostyaev commented 2 months ago

See https://tabby.tabbyml.com/

abcdw commented 1 month ago

One possible solution would be to use https://code.bsdgeek.org/adam/corfu-candidate-overlay with the usual capf.
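Wiring that up might look like the configuration sketch below. This assumes `use-package`, that both packages are installed, and that ellama's completion is exposed through `completion-at-point-functions`; the mode name follows the corfu-candidate-overlay package.

```elisp
;; Configuration sketch, not a tested setup.
(use-package corfu
  :init
  (global-corfu-mode))

(use-package corfu-candidate-overlay
  :after corfu
  :config
  ;; Show the first matching candidate as an inline overlay while typing.
  (corfu-candidate-overlay-mode +1))
```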