WilliamKarolDiCioccio / open_local_ui

OpenLocalUI: Native desktop app for Windows, macOS, and Linux. Easily run Large Language Models locally, no complex setup required. Inspired by OpenWebUI's simplicity for LLM use.
MIT License

Model download scheduler #11

Open WilliamKarolDiCioccio opened 1 month ago

WilliamKarolDiCioccio commented 1 month ago

I think it would be nice to have a simple scheduling system for downloading Ollama models. It would also be a good opportunity to add a progress bar at the bottom of the models management page for background downloads.

WilliamKarolDiCioccio commented 2 weeks ago

@Rossi1337 I was thinking of a system that queues each task and shows the currently active one at the bottom of the models page. By task I mean both pull and push requests. Do you have time to work on that? Meanwhile, I'm working on the updater feature.
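A minimal sketch of what such a queue could look like. This is only illustrative, in Python rather than the app's own Flutter/Dart, and the `ModelTask`/`TaskScheduler` names and the `report` callback are hypothetical, not part of the codebase: a background worker drains the queue one task at a time, and the UI can read `scheduler.active` to render the progress bar for the task in flight.

```python
import queue
import threading
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ModelTask:
    kind: str              # "pull" or "push", as discussed above
    model: str             # model name, e.g. "llama3" (hypothetical example)
    progress: float = 0.0  # 0.0 .. 1.0, updated while the task runs


class TaskScheduler:
    """Runs queued model tasks one at a time on a background thread and
    exposes the currently active task so a UI can poll its progress."""

    def __init__(self, runner: Callable[[ModelTask, Callable[[float], None]], None]):
        # `runner` performs the actual pull/push and calls the supplied
        # callback with progress values; here it is injected for testability.
        self._queue: "queue.Queue[ModelTask]" = queue.Queue()
        self._runner = runner
        self.active: Optional[ModelTask] = None
        threading.Thread(target=self._loop, daemon=True).start()

    def enqueue(self, task: ModelTask) -> None:
        self._queue.put(task)

    def wait(self) -> None:
        # Block until every queued task has finished (useful for tests).
        self._queue.join()

    def _loop(self) -> None:
        while True:
            task = self._queue.get()
            self.active = task

            def report(p: float, t: ModelTask = task) -> None:
                t.progress = p  # the UI polls scheduler.active.progress

            self._runner(task, report)
            self.active = None
            self._queue.task_done()
```

A usage example with a fake runner standing in for the real Ollama pull/push calls:

```python
finished = []

def fake_runner(task, report):
    for p in (0.5, 1.0):   # simulate download progress updates
        report(p)
    finished.append((task.kind, task.model, task.progress))

scheduler = TaskScheduler(fake_runner)
scheduler.enqueue(ModelTask("pull", "llama3"))
scheduler.enqueue(ModelTask("push", "my-model"))
scheduler.wait()
```

Tasks run strictly in FIFO order, so the bottom bar only ever has one active download to display.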