stavsap / comfyui-ollama

Apache License 2.0

keep_alive needs a '-1' option (keep alive forever) #59

Open morgan55555 opened 2 weeks ago

morgan55555 commented 2 weeks ago

My own Ollama server runs a single LLM model without unloading. Running comfyui-ollama with keep_alive causes the server to unload the model every time. We need the ability to skip these loads/unloads for better performance.

bigcat88 commented 1 week ago

Just specify "-1"; it will be adjusted to "-1m", which is the keep-loaded-forever behavior in the Ollama API.
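
For context, a minimal sketch of the corresponding raw Ollama API call (not from this repo; it assumes a local Ollama server on the default port and a pulled model named "llama3"):

import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Say hello.",
        "stream": False,
        "keep_alive": "-1m",  # any negative duration keeps the model loaded indefinitely
    },
)
print(resp.json()["response"])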

morgan55555 commented 1 week ago

Nope, it will change back to 0.

"keep_alive": ("INT", {"default": 5, "min": 0, "max": 60, "step": 5}),
bigcat88 commented 1 week ago

You are right, the ComfyUI frontend will reset it back; I missed that.

But when using ComfyUI through API endpoints it will work (specifying -1).
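
A hedged sketch of that API-endpoint route: export the workflow in API format, patch the keep_alive input, then submit it to ComfyUI's /prompt endpoint. The node id "6" is an illustrative assumption; use the id of the Ollama node from your own export.

import json
import urllib.request

with open("workflow_api.json") as f:  # workflow exported via "Save (API Format)"
    workflow = json.load(f)

workflow["6"]["inputs"]["keep_alive"] = -1  # bypasses the widget's "min": 0 clamp

req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode())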

A PR to fix this would be very simple: just change "min": 0 to "min": -1, and this will fix it for non-API ComfyUI mode.
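
As a sketch, the proposed change to the input definition quoted above would look like this (same field names, only the minimum lowered):

"keep_alive": ("INT", {"default": 5, "min": -1, "max": 60, "step": 5}),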

stavsap commented 13 hours ago

I have added support for this, please try.