av / harbor

Effortlessly run LLM backends, APIs, frontends, and services with one command.
https://github.com/av/harbor
Apache License 2.0

Ollama share models between host and container #129

Closed shenhai-ran closed 1 month ago

shenhai-ran commented 2 months ago

Hello, I notice that ~/.ollama on the host is mounted to /root/.ollama in the container; however, from the container I can't access the models that have been downloaded by the host. Is that on purpose?

Anything I can do to reuse the models from the host?

Thanks!

av commented 1 month ago

Hi, the idea is definitely to share models with the host; the current default location comes from Ollama's defaults on macOS.

The path is configurable and you can point it to your actual cache path like this:

harbor config set ollama.cache /usr/share/ollama/.ollama

# Test:
# start Ollama
harbor up
# list models
harbor ollama ls
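For reference, the right host-side path depends on how Ollama was installed: the Linux systemd service stores models under /usr/share/ollama/.ollama, while macOS uses ~/.ollama. A minimal sketch of picking the path per platform (this is not logic harbor performs itself; adjust the paths if your install differs):

```shell
# Sketch only: choose the host-side Ollama model cache to point harbor at.
# Paths are Ollama's documented defaults, not auto-detected by harbor.
case "$(uname -s)" in
  Linux)  cache="/usr/share/ollama/.ollama" ;;  # Linux systemd service default
  Darwin) cache="$HOME/.ollama" ;;              # macOS default
  *)      cache="$HOME/.ollama" ;;              # fall back to the home-dir layout
esac
echo "harbor config set ollama.cache $cache"
```

Running the printed command makes the container's /root/.ollama mount point at the same directory the host's Ollama already uses, so downloaded models are shared.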

Edit: updated Ollama service docs to be more explicit about this

shenhai-ran commented 1 month ago

Thanks! It works on Linux.