sammcj opened this issue 2 months ago
The project looks neat, but I can't see a way to configure local LLM servers such as Ollama.
Is there somewhere you can set an OpenAI compatible API endpoint?
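For context on what "OpenAI compatible API endpoint" means here: Ollama exposes an OpenAI-style API under `http://localhost:11434/v1` by default, so a client generally only needs the base URL swapped. Below is a minimal sketch using only the Python standard library; the model name `"llama3"` and the prompt are placeholders, and nothing here is this project's actual configuration mechanism.

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible base URL (assumption: default install, local host).
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request against a local endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(OLLAMA_BASE_URL, "llama3", "Hello!")
# urllib.request.urlopen(req) would send this to a running Ollama instance.
```

The same request shape works against any OpenAI-compatible server; only `base_url` changes.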
We haven't integrated Ollama yet, but it's on our roadmap. We'll ping you once local LLM support is added.