balazik / ComfyUI-PuLID-Flux

PuLID-Flux ComfyUI implementation
Apache License 2.0

Question about VRAM and RAM requirements #13

Closed jojotaro1994 closed 1 month ago

jojotaro1994 commented 1 month ago

Hi, I would like to know if this implementation has high requirements for system RAM in addition to VRAM. For example, is there a specific amount of RAM needed for it to work properly? Thank you very much for your response!

balazik commented 1 month ago

Hi, I never measured the RAM requirements. It's relatively hard to do because it depends on which Flux model and workflow you use. For example, a simple workflow with KSampler and flux1-dev-fp8-e4m3fn, t5xxl_fp8_e4m3fn, flux1_vae, and PuLID Flux loaded as 16-bit bfloat takes ~11 to ~16 GB of RAM (it's a range because ComfyUI can load and unload models while generating).
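For a rough sense of where those numbers come from, here is a back-of-envelope sketch of raw weight sizes. The parameter counts (~12B for the Flux.1-dev transformer, ~4.7B for the T5-XXL text encoder) are approximate public figures, not measurements from this repo, and real RAM use is higher because of activations, the VAE, and ComfyUI's caching:

```python
# Rough estimate of RAM needed just to hold model weights.
# Parameter counts below are assumptions (approximate public figures).

BYTES_PER_PARAM = {"fp8_e4m3fn": 1, "bf16": 2, "fp32": 4}

def weight_gb(params_billion: float, dtype: str) -> float:
    """Approximate size in GiB of the raw weights alone."""
    return params_billion * 1e9 * BYTES_PER_PARAM[dtype] / 1024**3

flux_dev_fp8 = weight_gb(12.0, "fp8_e4m3fn")  # Flux.1-dev transformer, ~12B params
t5xxl_fp8 = weight_gb(4.7, "fp8_e4m3fn")      # T5-XXL text encoder, ~4.7B params

print(f"flux1-dev fp8: ~{flux_dev_fp8:.1f} GiB")  # ~11.2 GiB
print(f"t5xxl fp8:     ~{t5xxl_fp8:.1f} GiB")     # ~4.4 GiB
```

Weights alone land in the same ballpark as the ~11 to ~16 GB observed, which is why switching from bf16 (2 bytes/param) to fp8 (1 byte/param) roughly halves a model's footprint.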

But I recommend at least 32 GB of RAM if you want to play with ComfyUI, otherwise it will swap a lot. There are also many command-line flags you can use to save memory.
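As a sketch, these are some of ComfyUI's memory-related launch flags (flag names taken from ComfyUI's CLI; verify the exact set on your install with `python main.py --help`):

```shell
# Memory-saving ComfyUI launch options (check --help for your version):
python main.py --lowvram               # split the model into parts to use less VRAM
python main.py --novram                # for when --lowvram is not enough
python main.py --disable-smart-memory  # aggressively offload models to RAM instead of keeping them in VRAM
python main.py --fp8_e4m3fn-unet --fp8_e4m3fn-text-enc  # load diffusion model and text encoder as fp8
```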

Orenji-Tangerine commented 1 month ago

My system RAM peaks at 20 GB and VRAM peaks at 15+ GB during the first run; it takes 80 seconds for a 960x1280 image at 10 steps with Euler + the Simple scheduler. My system build: 4060 Ti 16 GB + 64 GB DDR4, PyTorch 2.1.2 + CUDA 11.8.

balazik commented 1 month ago

No other questions, closing.