Open 88plug opened 1 month ago
For various reasons we believe it's unwise to allow self-updating of the WebUI itself within its container, as this somewhat defeats the point of using Docker in the first place. However, given that Ollama is a self-contained binary, is not overly large (~300 MB download, give or take), and we already allow updating of things like RAG and STT models, I believe we could make an exception and allow the Ollama-included image to update its Ollama install on startup when enabled with a variable such as `OLLAMA_BINARY_AUTO_UPDATE=True`.
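As a sketch of what that could look like, a startup hook in the container's entrypoint could gate the update behind the proposed variable. Everything here is hypothetical: neither `OLLAMA_BINARY_AUTO_UPDATE` nor these helper functions exist upstream today; the only real piece is Ollama's official install script URL.

```shell
#!/usr/bin/env sh
# Hypothetical entrypoint hook: re-install Ollama at container startup
# only when the proposed OLLAMA_BINARY_AUTO_UPDATE variable opts in.

should_update() {
  # Treat only an explicit "True"/"true" as opt-in; default stays off
  case "${OLLAMA_BINARY_AUTO_UPDATE:-False}" in
    True|true) return 0 ;;
    *)         return 1 ;;
  esac
}

update_ollama() {
  # Ollama's official install script replaces the binary in place
  curl -fsSL https://ollama.com/install.sh | sh
}

if should_update; then
  echo "OLLAMA_BINARY_AUTO_UPDATE is set, refreshing bundled Ollama..."
  update_ollama
fi
```

Defaulting to off keeps current behavior for everyone who doesn't set the variable.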
When building the Dockerfile from scratch with USE_CUDA=true and USE_OLLAMA=true, I was able to successfully update the bundled version. Is there a way the team can create a build pipeline? That seems the easiest way to keep the bundled Ollama version up to date.
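For reference, a from-source build with those flags looks roughly like this. `USE_CUDA` and `USE_OLLAMA` are the Dockerfile build args mentioned above; the tag `open-webui:ollama-local` is just an example local name.

```shell
# Clone Open WebUI and build with CUDA and bundled Ollama enabled.
git clone https://github.com/open-webui/open-webui.git
cd open-webui
docker build \
  --build-arg USE_CUDA=true \
  --build-arg USE_OLLAMA=true \
  -t open-webui:ollama-local .
```

This bakes in whatever Ollama version is current at build time, which is exactly why a recurring pipeline would help.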
There is a build pipeline, but currently it only runs when we update our code, not when Ollama updates theirs.
If possible, please update the pipeline to also cover the bundled version, which is the topic here. That would be better than what we have now, which appears to be a static build rather than a pipeline.
We have another plan.
> Is there a way the team can create a build pipeline? That seems the easiest way to keep the bundled Ollama version up-to-date.
The Ollama version is determined at build time; it's not exactly "statically" bundled. The only way it can be updated post-build is by making those changes when the container starts up, which will significantly increase your startup time.
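If startup-time updating were implemented anyway, the cost could be kept low by only re-downloading when the installed version differs from the latest release. A rough sketch with hypothetical helper names; the only real behaviors assumed are that `ollama --version` prints the semver and that GitHub redirects `/releases/latest` to the tagged release URL.

```shell
# Hypothetical startup check: skip the download when already current.

installed_version() {
  # "ollama --version" prints a line containing the semver, e.g. 0.3.0
  ollama --version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1
}

latest_version() {
  # GitHub redirects /releases/latest to the tagged release URL (.../v0.3.0)
  curl -fsSLI -o /dev/null -w '%{url_effective}' \
    https://github.com/ollama/ollama/releases/latest | sed 's#.*/v##'
}

versions_differ() {
  # A plain string comparison is enough for an equality check
  [ "$1" != "$2" ]
}
```

The entrypoint would then only run the install script when `versions_differ "$(installed_version)" "$(latest_version)"` holds, so an up-to-date container starts as fast as today.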
I think if you want your Ollama version to be up to date, it's probably best to run Ollama and Open WebUI as separate containers, where you can control each one's running version.
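For example, the two containers can share a Docker network so the UI reaches Ollama by name. The network, container names, ports, and volumes below are illustrative; `OLLAMA_BASE_URL` is the Open WebUI setting for pointing at an external Ollama.

```shell
# Put both containers on one network so Open WebUI can reach Ollama by name.
docker network create ollama-net

docker run -d --name ollama --network ollama-net \
  -v ollama:/root/.ollama \
  ollama/ollama

docker run -d --name open-webui --network ollama-net \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://ollama:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

Updating Ollama is then a `docker pull ollama/ollama` plus recreating only that container, leaving the UI untouched.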
I suppose using the `dev-ollama` image is also an option, as it gets updated a lot more frequently than `ollama`, but it could be unstable.
I think `OLLAMA_BINARY_AUTO_UPDATE=True` is a great solution!
**Is your feature request related to a problem? Please describe.**
When I run the "Installing Open WebUI with Bundled Ollama Support" command, the bundled Ollama version is out of date and not compatible with new models such as mistral-nemo (requires Ollama 0.2.8) and others.
```
docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
The latest Ollama version as of writing is 0.3.0.
**Describe the solution you'd like**
To improve this, I suggest updating the bundled Ollama version to the latest stable release and implementing an automated upgrade system. Here's a proposed plan:
1. **Bundle Latest Ollama:** Include the most recent stable Ollama version in the installation package for immediate compatibility with new models.
2. **Automated Upgrade System:**
**Describe alternatives you've considered**
Running Ollama on the host machine solves this, but new users will find that models don't work without the latest version.
**Additional context**
Love Love Love what you are doing!