blib-la / runpod-worker-comfy

ComfyUI as a serverless API on RunPod
GNU Affero General Public License v3.0

Download models on the fly #60

Open TimPietrusky opened 3 months ago

TimPietrusky commented 3 months ago

Is your feature request related to a problem? Please describe. The Docker images are very large, and users would like to use just the base image (which doesn't contain any models). This would make it more convenient for them to work on the image and add things, without having to upload a huge image every time they make a change.

Describe the solution you'd like

Describe alternatives you've considered

Additional context This idea came out of https://discord.com/channels/912829806415085598/1273963578369642557 and https://discord.com/channels/912829806415085598/1270792081580753047.
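
For reference, one way "download on the fly" could look is a small startup hook that pulls any missing models before the worker begins handling jobs. This is only a sketch, not part of the current image: the `MODEL_URLS` and `MODELS_DIR` environment variables and the checkpoint path are assumptions made for illustration.

```python
import os
import urllib.request
from pathlib import Path

# Hypothetical layout: MODEL_URLS is a comma-separated list of download URLs,
# MODELS_DIR is wherever ComfyUI expects its checkpoints (path is an assumption).
MODELS_DIR = Path(os.environ.get("MODELS_DIR", "/comfyui/models/checkpoints"))


def download_models() -> None:
    """Fetch any missing models before the worker starts handling jobs."""
    MODELS_DIR.mkdir(parents=True, exist_ok=True)
    for url in filter(None, os.environ.get("MODEL_URLS", "").split(",")):
        target = MODELS_DIR / Path(url.split("?")[0]).name
        if target.exists():
            continue  # already present from the image or a previous cold start
        print(f"downloading {url} -> {target}")
        urllib.request.urlretrieve(url, target)


if __name__ == "__main__":
    download_models()
```

The obvious trade-off is cold-start time: every fresh worker has to download the models before it can serve its first job, which is exactly the failure mode described in the next comment.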

jelling commented 2 months ago

It's been a while, but I recall trying this via a custom image I built. The long initial download times caused the RunPod serverless process to restart the worker repeatedly... which then restarted the download. Perhaps that's changed since then, I don't know.

A simpler and more reliable way to accomplish your goal of a smaller image: mount your network volume on a CPU-only or cheap GPU pod, download the models to it, and then use those models from this image. Keep in mind, though, that loading models from the network volume is much slower than from local disk.
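
As a rough illustration of that approach, the worker could point ComfyUI's model folders at the pre-populated network volume on startup. This is a sketch under assumptions: the `/runpod-volume` mount point, the `/comfyui/models` path, and the folder names are not guaranteed by this repo, and ComfyUI's `extra_model_paths.yaml` is another way to achieve the same thing.

```python
import os
from pathlib import Path

# Sketch of the network-volume approach described above. The mount point
# (/runpod-volume) and the ComfyUI models path are assumptions about the setup.
VOLUME_MODELS = Path(os.environ.get("VOLUME_MODELS", "/runpod-volume/models"))
COMFY_MODELS = Path(os.environ.get("COMFY_MODELS", "/comfyui/models"))


def link_volume_models() -> None:
    """Point ComfyUI's model folders at the pre-populated network volume."""
    for subdir in ("checkpoints", "loras", "vae"):
        src = VOLUME_MODELS / subdir
        dst = COMFY_MODELS / subdir
        if not src.is_dir():
            continue  # nothing was downloaded into this folder on the volume
        if dst.exists() or dst.is_symlink():
            continue  # don't clobber anything baked into the image
        dst.parent.mkdir(parents=True, exist_ok=True)
        dst.symlink_to(src, target_is_directory=True)


if __name__ == "__main__":
    link_volume_models()
```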