comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

After the latest updates, Flux UNET does not unload when switching fp8/fp16 modes. #4464

Closed JorgeR81 closed 2 months ago

JorgeR81 commented 2 months ago

Expected Behavior

The Flux UNET fp8 model is unloaded from RAM after switching to fp16 in the node settings.

Actual Behavior

Both models are kept in RAM (or in the page file, in my case), and generation slows to a crawl.
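A hypothetical sketch of the suspected behavior (this is not ComfyUI's actual code): if loaded models are cached under a key that includes the weight dtype, switching the node from fp8 to fp16 loads a second copy instead of evicting the first, so both stay resident.

```python
# Hypothetical illustration of the suspected bug -- not ComfyUI's actual code.
# If the cache key includes the dtype, switching modes adds a second copy
# rather than replacing the first, so both remain resident in RAM.

loaded_models = {}  # (model_name, dtype) -> approximate size in GB

def load_unet(name, dtype, size_gb):
    """Return a cached model, loading it if this (name, dtype) is new."""
    key = (name, dtype)
    if key not in loaded_models:
        loaded_models[key] = size_gb  # stand-in for the actual weight tensors
    return loaded_models[key]

load_unet("flux1-dev", "fp8", 12)   # first generation, fp8 mode
load_unet("flux1-dev", "fp16", 24)  # user switches the node to fp16

# Expected: only the fp16 copy resident. Actual (the reported bug): both.
resident_gb = sum(loaded_models.values())
print(len(loaded_models), resident_gb)  # 2 copies, ~36 GB resident
```

Evicting stale entries whose dtype no longer matches the requested one would restore the expected single-copy behavior.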

Steps to Reproduce

Debug Logs


Other

I'm on Windows 10

I don't know exactly which commit caused this, because I've been using the checkpoint version lately. But I remember I had no issues with this before.

JorgeR81 commented 2 months ago

By the way, don't try to replicate this yourself unless you have 64 GB of RAM (or preferably 128 GB).

I only have 32 GB of RAM, and it was quite stressful for my system.

My page file grew by over 40 GB, the system slowed down, and it took several minutes to cancel the queue in the ComfyUI interface.

JorgeR81 commented 2 months ago

I just realized I was using --force-fp32 at the time. That's the reason for the system slowdown while loading the UNET model in fp16 mode.
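Back-of-the-envelope memory math makes the slowdown plausible. Flux.1 has roughly 12 billion parameters, so weight storage alone scales with bytes per parameter; under --force-fp32 that is ~48 GB, already past 32 GB of RAM, and keeping a second copy resident forces heavy paging. A small sketch (parameter count approximate; activations and overhead not included):

```python
# Rough weight-memory estimate for a ~12B-parameter UNET (Flux.1).
# Parameter count is approximate; this counts weights only.
PARAMS = 12e9

bytes_per_param = {"fp8": 1, "fp16": 2, "fp32": 4}

sizes_gb = {dtype: PARAMS * nbytes / 1e9
            for dtype, nbytes in bytes_per_param.items()}

for dtype, gb in sizes_gb.items():
    print(f"{dtype}: ~{gb:.0f} GB")
# fp8: ~12 GB, fp16: ~24 GB, fp32: ~48 GB -- a single fp32 copy alone
# exceeds 32 GB of RAM, so any second resident copy spills to the page file.
```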