alondmnt / joplin-plugin-jarvis

Joplin (note-taking) assistant running a very intelligent system (OpenAI/GPT, Hugging Face, Gemini, Llama, Universal Sentence Encoder, etc.)
GNU Affero General Public License v3.0

Ollama support #16

Closed by MRP0E 8 months ago

MRP0E commented 10 months ago

It would be an awesome feature for Jarvis to use a local LLM via Ollama.

alondmnt commented 10 months ago

I will look into it (no promises). In the meantime, we already support running local models via GPT4All and LM Studio.

dwinkler1 commented 9 months ago

You can serve Ollama models through an OpenAI-compatible API using LiteLLM. See https://github.com/alondmnt/joplin-plugin-jarvis/pull/19/commits/281fea474507014048c46a346ff1b287cf412d7e
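For readers who want to try this outside the plugin first, here is a minimal sketch of the idea: LiteLLM runs as a local proxy that translates OpenAI-style requests into Ollama calls, so any OpenAI client can talk to a local model. The port (4000) and model name (`ollama/llama2`) below are assumptions based on LiteLLM's defaults; adjust them to match your own setup.

```python
# Minimal sketch: query a local Ollama model through a LiteLLM proxy
# that exposes an OpenAI-compatible API.
# Assumes you started the proxy with something like:
#   litellm --model ollama/llama2
# and that it listens on localhost:4000 (LiteLLM's default; adjust if needed).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000/v1",  # the local LiteLLM proxy endpoint
    api_key="not-needed",                 # local proxy; any placeholder string works
)

response = client.chat.completions.create(
    model="ollama/llama2",  # hypothetical model name; match your proxy config
    messages=[{"role": "user", "content": "Summarize my last note."}],
)
print(response.choices[0].message.content)
```

The same trick is what makes the linked commit work: Jarvis only needs to be pointed at the proxy's base URL, and the proxy handles the Ollama side.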

MRP0E commented 9 months ago

@danielw2904 That's awesome. Thanks for the doc. Will try it ASAP.