open-webui / open-webui

User-friendly WebUI for LLMs (Formerly Ollama WebUI)
https://openwebui.com
MIT License

Seamless Ollama Upgrades for Open WebUI #4168

Open 88plug opened 1 month ago

88plug commented 1 month ago

Is your feature request related to a problem? Please describe. When I run the command from Installing Open WebUI with Bundled Ollama Support, the bundled Ollama version is out of date and not compatible with new models like mistral-nemo (which requires Ollama 0.2.8) and others.

docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama

The latest Ollama version as of writing is 0.3.0.

Describe the solution you'd like To improve this, I suggest updating the bundled Ollama version to the latest stable release and implementing an automated upgrade system. Here's a proposed plan:

  1. Bundle Latest Ollama: Include the most recent stable Ollama version in the installation package for immediate compatibility with new models.

  2. Automated Upgrade System:

    • Develop a background task that periodically checks (e.g., once weekly) for updates against the official Ollama repository (a possible version check is sketched after this list).
    • Notify the user via an in-app notification when an update is available, with an option to upgrade.
    • Automate the download and application of updates upon user approval to minimize manual intervention.
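A minimal sketch of what such a check could look like as a shell task; the GitHub releases endpoint and the ollama --version output are real, but the wiring into Open WebUI is purely illustrative:

#!/usr/bin/env bash
# Hypothetical weekly update check: compare the bundled Ollama version
# against the latest release tag on GitHub.
installed="$(ollama --version 2>/dev/null | grep -oE '[0-9]+(\.[0-9]+)+' | head -n1)"
latest="$(curl -fsSL https://api.github.com/repos/ollama/ollama/releases/latest \
  | grep -oE '"tag_name": *"v[0-9.]+"' | grep -oE '[0-9]+(\.[0-9]+)+')"
if [ -n "$latest" ] && [ "$installed" != "$latest" ]; then
  echo "Ollama update available: $installed -> $latest"  # surface as an in-app notification
fi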

Describe alternatives you've considered Running Ollama on the host machine would solve this, but new users will find that models don't work without the latest version.

Additional context Love Love Love what you are doing!

justinh-rahb commented 1 month ago

For various reasons we believe it's unwise to allow self-updating of the WebUI itself within its container, as this somewhat goes against the point of using Docker to begin with. However, given that Ollama is a self-contained binary anyway, is not overly large (~300MB download, give or take), and we already allow updating of things like RAG and STT models, I believe we could make an exception and allow the Ollama-included image to update its Ollama install on startup when enabled with a variable such as OLLAMA_BINARY_AUTO_UPDATE=True.
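A minimal sketch of what that could look like in the container entrypoint, assuming the proposed OLLAMA_BINARY_AUTO_UPDATE variable (which does not exist yet); the official Ollama install script URL is real:

#!/usr/bin/env bash
# Hypothetical entrypoint wrapper: refresh the bundled Ollama binary on
# startup when the (proposed) OLLAMA_BINARY_AUTO_UPDATE variable is set.
if [ "$OLLAMA_BINARY_AUTO_UPDATE" = "True" ]; then
  echo "OLLAMA_BINARY_AUTO_UPDATE=True; updating bundled Ollama binary..."
  curl -fsSL https://ollama.com/install.sh | sh
fi
exec "$@"  # hand off to the normal start command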

88plug commented 1 month ago

When building the Dockerfile from scratch with USE_CUDA=true and USE_OLLAMA=true, I was able to update the bundled version. Is there a way the team can create a build pipeline? That seems the easiest way to keep the bundled Ollama version up to date.
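For reference, a build along those lines looks like this; USE_CUDA and USE_OLLAMA are build args in the project's Dockerfile, and the bundled Ollama is fetched fresh at build time (the image tag below is illustrative):

git clone https://github.com/open-webui/open-webui.git
cd open-webui
docker build --build-arg USE_CUDA=true --build-arg USE_OLLAMA=true \
  -t open-webui:ollama-local .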

justinh-rahb commented 1 month ago

There is a build pipeline, but currently it only runs when we update our code, not when Ollama updates theirs.

88plug commented 1 month ago

> There is a build pipeline, but currently it only runs when we update our code, not when Ollama updates theirs.

If possible, please update the pipeline to cover the bundled Ollama version, which is the topic here. That would be better than what we have now, which appears to be a static build rather than a pipeline.

justinh-rahb commented 1 month ago

We have another plan.

thearyadev commented 1 month ago

> Is there a way the team can create a build pipeline? That seems the easiest way to keep the bundled Ollama version up to date.

The Ollama version is determined at build time; it's not exactly "statically" bundled. The only way it can be updated post-build is by making those changes when the container starts up, which will significantly increase your startup time.

I think if you want to keep your Ollama version up to date, it's probably best to run Ollama and Open WebUI as separate containers, so you can control each one's running version.
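As a sketch of that setup (container names and the host-gateway mapping follow the project's README conventions; adjust ports and volumes to taste):

# Ollama in its own container, upgradable independently of the WebUI
docker run -d --gpus=all -p 11434:11434 -v ollama:/root/.ollama \
  --name ollama --restart always ollama/ollama

# Open WebUI pointed at the separate Ollama instance
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always ghcr.io/open-webui/open-webui:main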

I suppose using the dev-ollama image is also an option, as it gets updated a lot more frequently than the release :ollama image, but it could be unstable.

88plug commented 1 month ago

I think OLLAMA_BINARY_AUTO_UPDATE=True is a great solution!