Codium-ai / codiumai-jetbrains-release


Support for offline models like Ollama #188

Open anodynos opened 4 months ago

anodynos commented 4 months ago

Do you plan to support local models running on our own GPUs, instead of having to send our code to OpenAI and pay for GPT-3.5/4?

qhreul commented 2 months ago

With the release of new LLMs dedicated to coding (e.g. CodeGemma), it would be great to be able to connect to a choice of LLMs running locally.
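For context, a locally running Ollama instance exposes an HTTP API on port 11434, so any client can query a local model with a plain POST request. The sketch below shows the shape of such a call using only the standard library; the model name and prompt are illustrative assumptions, and this is not something the plugin supports today:

```python
import json
import urllib.request

# Default endpoint of a local Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its completion."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a local Ollama install with the model pulled first, e.g.:
    #   ollama pull codegemma
    print(generate("codegemma", "Write a Python function that reverses a string."))
```

Nothing leaves the machine in this setup: the code and the completion stay on localhost, which is the privacy property the original request is after.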

GadiZimerman commented 2 months ago

Depending on your organization's preferences, sending your code to OpenAI may not be required, as CodiumAI offers on-premises solutions. However, local models, such as those running on your edge machine, have not yet achieved the desired quality. Nonetheless, we are closely monitoring advancements in this area.