This PR adds a small helper that wraps ollama.list() in a try/except, instead of calling it directly.
I wanted this because my Ollama instance(s) are often offline, and llm-ollama treats that as fatal, which stops llm-cli from running.
I think it's cleaner to return an empty model list and let llm-cli continue.
In my case, this means that if my local Docker Ollama isn't running or my desktop is off, I can still use other LLMs without needing LLM_LOAD_PLUGINS as a workaround.
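A minimal sketch of the idea (not the exact code in this PR): the helper name and the injectable `list_fn` parameter are hypothetical, used here so the error-handling can be shown in isolation; in the plugin itself this would wrap a call to `ollama.list()`.

```python
def safe_list_models(list_fn):
    """Call list_fn (e.g. ollama.list) and return its models,
    or an empty list if the Ollama server is unreachable, so
    llm-cli can keep running with its other plugins."""
    try:
        response = list_fn()
    except Exception:
        # Ollama offline (e.g. Docker stopped, desktop off):
        # report no models instead of raising a fatal error.
        return []
    # Assumes the response is a mapping with a "models" key.
    return response.get("models", [])
```

The key design choice is catching the connection failure at the plugin boundary and degrading to "no models available", rather than letting the exception propagate and abort plugin loading.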