zed-industries / zed

Code at the speed of thought – Zed is a high-performance, multiplayer code editor from the creators of Atom and Tree-sitter.
https://zed.dev

Expand AI Code Completion beyond Copilot and Supermaven #18490

Open · zerocorebeta opened 2 months ago

zerocorebeta commented 2 months ago

Check for existing issues

Describe the feature

After going through: https://zed.dev/docs/completions

Zed currently supports completions only through external services such as GitHub Copilot and Supermaven, which is restrictive. Many users, for privacy or performance reasons, might prefer alternatives like Gemini Flash or local models served via Ollama.

There are several capable LLMs that support the fill-in-the-middle (FIM) objective, such as CodeGemma. Additionally, platforms like Continue.dev already allow code completion via local models, with StarCoder served through Ollama as the default.
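
To make the request shape concrete, here is a minimal sketch of what a FIM completion against a local Ollama server could look like. The endpoint (`/api/generate`), the model tag, and the CodeGemma-style control tokens are assumptions for illustration only; other models (StarCoder, Qwen2.5-Coder) spell their FIM tokens differently, and this is not Zed code.

```rust
// Sketch: sending a fill-in-the-middle (FIM) request to a local Ollama server.
// Requires the `reqwest` (with "blocking" + "json" features) and `serde_json` crates.
// The model tag and FIM control tokens below are assumptions (CodeGemma-style).
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Code before and after the cursor position.
    let prefix = "fn add(a: i32, b: i32) -> i32 {\n    ";
    let suffix = "\n}";

    // FIM prompt: ask the model to generate the text that belongs
    // between the prefix and the suffix.
    let prompt = format!("<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>");

    let body = json!({
        "model": "codegemma:2b",   // hypothetical local model tag
        "prompt": prompt,
        "raw": true,               // send the prompt without a chat template
        "stream": false,
    });

    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()?
        .json()?;

    // Ollama returns the generated text in the `response` field.
    println!("{}", resp["response"].as_str().unwrap_or(""));
    Ok(())
}
```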

Expanding LLM support to include more flexible, local, or privacy-focused options would greatly enhance Zed's appeal and utility for a wider range of developers.

If applicable, add mockups / screenshots to help present your vision of the feature

No response

ggerganov commented 1 month ago

We have recently extended the llama.cpp server with a specialized /infill endpoint that enables FIM requests with large contexts to run efficiently in local environments. A simple example of using this endpoint with Qwen2.5-Coder in Neovim is demonstrated here: https://github.com/ggerganov/llama.cpp/pull/9787
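
For anyone who wants to experiment with this, below is a minimal sketch of what a call to the `/infill` endpoint could look like from Rust. It assumes a llama.cpp server already running locally on port 8080 with a FIM-capable model; the example parameters and field names follow the server documentation and are assumptions for illustration, not anything in Zed.

```rust
// Sketch: a FIM request against llama.cpp's server `/infill` endpoint.
// Requires the `reqwest` (with "blocking" + "json" features) and `serde_json` crates.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let body = json!({
        "input_prefix": "fn fib(n: u64) -> u64 {\n    ",  // code before the cursor
        "input_suffix": "\n}",                              // code after the cursor
        "n_predict": 64,                                    // cap generated tokens
        "temperature": 0.1,
    });

    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://localhost:8080/infill")
        .json(&body)
        .send()?
        .json()?;

    // The completion text is returned in the `content` field.
    println!("{}", resp["content"].as_str().unwrap_or(""));
    Ok(())
}
```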

I believe it could be an interesting option to explore in the scope of this issue. Feel free to ping me if you have any questions.

20manas commented 1 month ago

I think Zed AI should also provide its own code completion functionality.