David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts

Feat: add support for Deepseek model #115

Open · zeroaddresss opened this issue 1 month ago

zeroaddresss commented 1 month ago

Ollama makes it possible to run the Deepseek-Coder-V2 model (one of the cheapest and best-performing models for coding), but the 236B variant is far too heavy to run locally. However, the DeepSeek API allows using it, and it gives great results as a code assistant. It would be awesome if it could be integrated into gen.nvim.

David-Kunz commented 3 days ago

Thanks for the suggestion, @zeroaddresss.

At the moment I'm focusing on open models that run locally, but maybe you can override some of the functions that send the request to the LLM in order to support the DeepSeek API?
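For example, the `command` option can be a function returning the curl invocation (gen.nvim substitutes `$body` with the JSON request body it builds), so pointing it at the DeepSeek endpoint might be a starting point. This is only a rough, untested sketch: the endpoint URL, the `deepseek-coder` model name, and the `DEEPSEEK_API_KEY` environment variable are assumptions about DeepSeek's OpenAI-compatible API, not part of gen.nvim.

```lua
-- Untested sketch: route gen.nvim's request to the DeepSeek API instead of a
-- local Ollama server. Endpoint, model name, and DEEPSEEK_API_KEY are assumed.
require('gen').setup({
  model = "deepseek-coder", -- assumed DeepSeek model identifier
  command = function(options)
    -- gen.nvim replaces $body with the JSON body (model, messages, ...) it builds
    return "curl --silent --no-buffer -X POST https://api.deepseek.com/chat/completions"
        .. " -H 'Content-Type: application/json'"
        .. " -H 'Authorization: Bearer " .. (os.getenv("DEEPSEEK_API_KEY") or "") .. "'"
        .. " -d $body"
  end,
})
```

Note that this only covers the request side: DeepSeek returns OpenAI-style chat completions, while gen.nvim expects Ollama's response format by default, so the response handling would most likely need adjusting as well.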

Thanks and best regards, David