For those of us running our LLMs locally, Ollama has been a boon. It also provides an OpenAI-compatible chat completion API, so adding support shouldn't be too difficult if you already support OpenAI.
I might implement this myself if I can find the spare time.
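For anyone wanting to experiment, here's a minimal sketch of what talking to that endpoint could look like using only the standard library. The base URL and payload shape follow Ollama's documented OpenAI compatibility (`/v1/chat/completions` on port 11434); the helper names and the model name are just placeholders:

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible base URL (assumed default install).
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model, messages, base=OLLAMA_BASE):
    """Build an OpenAI-style chat completion request aimed at Ollama."""
    url = f"{base}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def chat(model, messages):
    # Sending this requires a running Ollama instance with the model pulled.
    with urllib.request.urlopen(build_chat_request(model, messages)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Since the request/response shape matches OpenAI's, an existing OpenAI client can usually be pointed at Ollama just by overriding the base URL.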