NightMachinery opened 2 months ago
Agreed. It should also be able to manage context properly. For now, try one of the Tabby integrations, like this one; this feature is in the very long-term plan. Or maybe [help wanted]
@s-kostyaev Thanks. What models does tabby support? Skimming its docs, it seems to be a proprietary third-party LLM reseller?
I prefer to use deepseek coder 6.7b with tabby - it has very good performance for my use cases.
> Skimming its docs, it seems to be a proprietary third-party LLM reseller?
No. It pulls models from Hugging Face and runs them locally.
One possible solution could be to use https://code.bsdgeek.org/adam/corfu-candidate-overlay with the usual capf.
`ellama-code-complete` inserts the LLM output directly into the buffer, which is not ideal. Copilot's "ghost completions", which overlay short gray text that you can confirm later, are more useful. https://github.com/zerolfx/copilot.el does this, but it only supports Microsoft Copilot.

Also, there should be a way to limit the completions to at most N lines and M characters.
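The N-lines / M-characters cap could be a simple post-processing filter applied to the model output before it is shown. A minimal sketch of the idea (the function and its parameters are hypothetical, not part of ellama or copilot.el):

```python
def truncate_completion(text: str, max_lines: int = 3, max_chars: int = 200) -> str:
    """Trim an LLM completion to at most max_lines lines and max_chars characters.

    Hypothetical post-processing step; no such filter exists in ellama today.
    """
    # Keep only the first max_lines lines of the completion.
    lines = text.splitlines()[:max_lines]
    truncated = "\n".join(lines)
    # Then enforce the overall character budget.
    return truncated[:max_chars]
```

For example, `truncate_completion("a\nb\nc\nd", max_lines=2)` keeps only the first two lines. The same logic could run between receiving the LLM response and rendering the overlay, so long multi-line suggestions never flood the buffer.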