Open palindsay opened 2 months ago
It would be nice to be able to use Ollama with local LLMs instead of GitHub Copilot.
Yeah, this would be great. Ollama provides a completions API endpoint, so there should be a way to wire it in.
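For reference, Ollama serves a local HTTP API on `localhost:11434` by default, with a `/api/generate` endpoint for completions. A minimal sketch of calling it from Python (stdlib only) — the model name `codellama` is just an example and assumes it has been pulled locally:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a completion request for a locally running Ollama server."""
    payload = json.dumps({
        "model": model,    # any locally pulled model, e.g. "codellama"
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


# To actually send it (requires a running `ollama serve`):
#   with urllib.request.urlopen(build_request("codellama", "def fib(n):")) as resp:
#       print(json.load(resp)["response"])
```

The response JSON carries the generated text in its `response` field, so an editor integration would mainly need to map completion triggers to this request/response cycle.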