You can serve Ollama models via an OpenAI-compatible API using LiteLLM. See https://github.com/alondmnt/joplin-plugin-jarvis/pull/19/commits/281fea474507014048c46a346ff1b287cf412d7e
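For anyone who wants to try this right away, here is a minimal sketch of the idea. It assumes the LiteLLM proxy is started with something like `pip install 'litellm[proxy]' && litellm --model ollama/llama2`, that the `llama2` model is already pulled in Ollama, and that the proxy is listening on its default local port (4000 in recent LiteLLM versions; older versions may use a different port). Any OpenAI client can then talk to the local model:

```python
# Sketch: talk to a local Ollama model through LiteLLM's
# OpenAI-compatible proxy. Assumes the proxy was started with e.g.
#   pip install 'litellm[proxy]' && litellm --model ollama/llama2
# and that Ollama is running with the llama2 model pulled.
from openai import OpenAI

# The proxy speaks the OpenAI API, so base_url just points at it.
# No real key is needed locally, but the client requires some value.
client = OpenAI(base_url="http://localhost:4000", api_key="not-needed")

response = client.chat.completions.create(
    model="ollama/llama2",  # model name as registered with the proxy
    messages=[{"role": "user", "content": "Say hello from a local LLM."}],
)
print(response.choices[0].message.content)
```

The same base URL is presumably what the linked commit lets you point Jarvis's OpenAI endpoint setting at.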
@danielw2904 That's awesome. Thanks for the doc. Will try it asap.
It would be an awesome feature for Jarvis to use a local LLM via Ollama.