ai-dock / comfyui

ComfyUI docker images for use in GPU cloud and local environments. Includes AI-Dock base for authentication and improved user experience.

micromamba environment and ComfyUI both 20+ GB #28

Closed: ionflow closed this issue 3 months ago

ionflow commented 10 months ago

I've built the latest image, pushed it to Docker Hub, and run it on RunPod with all the models commented out and just 3 nodes. I'm using a network volume on RunPod for all the models I need. I have a 50 GB volume and it's 95% full: both ComfyUI and micromamba are coming in at around 20-25 GB. My models are pulled down from my network volume, so it makes sense that ComfyUI is that big, but does micromamba have to be that big?

[Screenshot 2024-01-07 at 3:17:22 PM]
robballantyne commented 10 months ago

Runpod has an issue with small file sizes on their volume storage. You'll notice that the size on /workspace is much larger than the size on /opt before the move. I've raised it but it won't be fixed as it's a known filesystem feature.
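The small-file overhead described here can be reproduced on any Linux machine; this is a generic sketch of the effect, not specific to RunPod's storage (the directory and file count are arbitrary):

```shell
# Demo: many tiny files occupy far more disk than their apparent size,
# because each file consumes at least one filesystem block. Volumes with
# large block sizes amplify this, which is why the same tree can report
# a much bigger size on /workspace than on /opt.
demo_dir="$(mktemp -d)"
for i in $(seq 1 1000); do
    printf 'x' > "$demo_dir/f$i"      # 1000 one-byte files
done
du -sh --apparent-size "$demo_dir"    # roughly the sum of file contents
du -sh "$demo_dir"                    # much larger actual on-disk usage
```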

If you're building the image with your own models and nodes it's probably not worth syncing at all. You could skip the sync (WORKSPACE_SYNC=false) and replace the output directory with a symlink to the workspace.
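A minimal sketch of that approach. The `WORKSPACE_SYNC=false` variable comes from the comment above, but the container paths (`/opt/ComfyUI`, `/workspace`) are assumptions about the image layout; here they default to local directories so the script runs anywhere:

```shell
set -eu

# In the real container these would be /opt/ComfyUI and /workspace
# (assumed paths); override via environment variables.
COMFYUI_DIR="${COMFYUI_DIR:-./ComfyUI}"
WORKSPACE="${WORKSPACE:-./workspace}"

# In practice, set this as a container env var so the image skips
# copying /opt into /workspace entirely.
export WORKSPACE_SYNC=false

# Replace ComfyUI's output directory with a symlink into the volume,
# so generated images land on persistent storage without the sync.
mkdir -p "$WORKSPACE/output" "$COMFYUI_DIR"
rm -rf "$COMFYUI_DIR/output"
ln -s "$(cd "$WORKSPACE" && pwd)/output" "$COMFYUI_DIR/output"
```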

Models should be placed in /workspace/storage and will automatically be linked to the correct locations.

ionflow commented 10 months ago

I see. Will /input, /output, and /models all be linked if I cloud sync from AWS to /workspace/storage?

robballantyne commented 8 months ago

Models will be linked if their paths match the mappings in mappings.sh described in the docs, but there's no mechanism for linking input/output. You'd have to sync those directly into the ComfyUI directory.
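Putting both answers together, a sync script might look like the sketch below. The bucket names are hypothetical and the storage subpaths are assumptions; check mappings.sh in the docs for the real mapping targets. Paths default to local directories so the sketch runs anywhere:

```shell
set -eu

WORKSPACE="${WORKSPACE:-./workspace}"     # /workspace on RunPod (assumed)
COMFYUI_DIR="${COMFYUI_DIR:-./ComfyUI}"   # /opt/ComfyUI in the image (assumed)

# Models: sync into /workspace/storage under paths matching mappings.sh;
# the image then links them into place automatically.
mkdir -p "$WORKSPACE/storage"
# aws s3 sync s3://my-bucket/models "$WORKSPACE/storage"   # hypothetical bucket

# input/output: no mapping exists, so sync them straight into ComfyUI.
mkdir -p "$COMFYUI_DIR/input" "$COMFYUI_DIR/output"
# aws s3 sync s3://my-bucket/input "$COMFYUI_DIR/input"    # hypothetical bucket
```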