Open NADOOITChristophBa opened 3 months ago
Using OpenAI on larger codebases, and rerunning this often, can incur significant cost and might prevent people from using it.
Ollama has added an OpenAI-compatible API for running local models. An option to use a local model would be nice.
relevant link to Ollama: https://ollama.com/blog/openai-compatibility
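As a rough sketch of what this could look like: Ollama serves an OpenAI-style chat completions endpoint at `http://localhost:11434/v1` (per the blog post above), so pointing the existing OpenAI requests at that base URL should be most of the work. The model name `llama2` below is just an example of a locally pulled model; the endpoint path and payload shape follow the OpenAI Chat Completions format.

```python
import json
from urllib import request

# Ollama's OpenAI-compatible endpoint (default local port 11434).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_request(prompt: str, model: str = "llama2") -> request.Request:
    """Build an OpenAI-style chat completion request against a local Ollama server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the API key, but OpenAI-style clients expect one.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_request("Why is the sky blue?")
    # Sending the request requires a running Ollama server (`ollama serve`):
    # with request.urlopen(req) as resp:
    #     answer = json.load(resp)["choices"][0]["message"]["content"]
    print(req.full_url)
```

With the official `openai` Python client the same idea is just `OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")`, which would let the rest of the code stay unchanged.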
I'm working on a fork for ollama