taketwo / llm-ollama

LLM plugin providing access to local Ollama models using the HTTP API
Apache License 2.0

Use try/except to return an empty model list if Ollama is not responding #8

Closed davedean closed 3 months ago

davedean commented 3 months ago

This PR adds a small helper function that wraps the `ollama.list()` call in a try/except block, instead of calling `ollama.list()` directly.
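
For illustration, here is a minimal sketch of what such a wrapper could look like. The helper name `_get_models` is hypothetical, and it assumes `ollama.list()` returns a mapping with a `"models"` key, as in the `ollama` Python client; the actual patch may differ.

```python
import ollama


def _get_models():
    """Hypothetical helper: fetch available Ollama models, returning an
    empty list if the Ollama server cannot be reached."""
    try:
        # ollama.list() raises (e.g. a connection error) when the
        # Ollama server is offline or unreachable.
        return ollama.list().get("models", [])
    except Exception:
        # Treat an unreachable server as "no models" rather than a
        # fatal error, so llm keeps working with other plugins.
        return []
```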

I wanted this because my Ollama instance(s) are often offline, and llm-ollama treats that as fatal, which stops llm-cli from running.

I think it's cleaner to return an empty list and let llm-cli continue.

In my case, this means that if my local Docker Ollama isn't running or my desktop is off, I can still use other LLMs without resorting to LLM_LOAD_PLUGINS as a workaround.

taketwo commented 3 months ago

Thanks for reporting and contributing! I've pushed a modified version of your fix and released a new plugin version.