kijai / ComfyUI-PyramidFlowWrapper

MIT License
97 stars 4 forks

Any tips on how to run on a card with 11gb VRAM #13

Open Kvento opened 10 hours ago

Kvento commented 10 hours ago

I'm trying to run the model on my RTX 2080ti, but so far I haven't been able to succeed.

The video memory consumption is too high: with the standard workflow settings (loading the model in bf16) I get a crash with the message "Allocation on device". If I enable fp16 mode with model dtype fp8_e4m3fn, the model loads without crashing, but during the PyramidFlow Sampler step the memory consumption is just crazy. It exceeds the card's VRAM, Shared GPU Memory kicks in, and the processing time tends to infinity.
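For context on why the dtype setting matters so much here: the weight footprint is roughly parameters × bytes-per-element, so fp8 halves it relative to bf16/fp16 (activations and latents come on top of that). A minimal back-of-the-envelope sketch; the 2B parameter count below is a made-up illustration, not the actual Pyramid Flow model size:

```python
# Bytes per element for the dtypes discussed in this thread.
BYTES_PER_ELEMENT = {"bf16": 2, "fp16": 2, "fp8_e4m3fn": 1}

def weight_memory_gib(n_params: float, dtype: str) -> float:
    """Approximate weight-only VRAM in GiB; ignores activations,
    latents, and framework overhead, which add several GB more."""
    return n_params * BYTES_PER_ELEMENT[dtype] / 1024**3

# Hypothetical 2B-parameter model, purely for illustration:
for dt in ("bf16", "fp8_e4m3fn"):
    print(dt, round(weight_memory_gib(2e9, dt), 2), "GiB")
```

This is why fp8 is the usual workaround on cards in the 8-12 GB range: it cuts the weight footprint in half, at some cost to quality.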

I've read that others run the model with lower memory consumption, so maybe I'm doing something wrong? I've already deleted and re-downloaded ComfyUI portable and the required nodes, but the result hasn't changed.

2024-10-11_21-22-38 2024-10-11_21-09-58

kijai commented 10 hours ago

It's very close to what it needs; maybe some other software is taking up VRAM in Windows? With my large monitor and a browser open with many tabs, Windows can take 1-3 GB of VRAM on its own.

Kvento commented 9 hours ago

No, my baseline system video memory consumption is 0.6-0.9 GB, and before launching ComfyUI I close any software or utilities that could affect it.
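One way to double-check that baseline number right before launching ComfyUI is to parse nvidia-smi's CSV query output. A small sketch; the query flags are nvidia-smi's standard interface, while the helper name is ours:

```python
import subprocess
from typing import List, Optional

def gpu_memory_used_mib(smi_output: Optional[str] = None) -> List[int]:
    """Return per-GPU used memory in MiB, parsed from
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`."""
    if smi_output is None:
        smi_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    # One line per GPU, each a bare integer in MiB.
    return [int(line.strip()) for line in smi_output.splitlines() if line.strip()]
```

Anything much above ~1 GB before ComfyUI starts suggests something else is still holding VRAM.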

2024-10-11_22-47-05

kijai commented 9 hours ago

That's odd then; with fp8 it's not even taking 10 GB for me... I'm on torch 2.4.1 cu124. Not sure what else could affect it.

rixjsjsbxh commented 8 hours ago

The 2080 Ti doesn't support BF16, so I can only use FP16, but FP16 throws NaN errors. I tried running it with FP8, but the memory usage is outrageous. Even with 22 GB of VRAM, it still runs out of memory.
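The BF16 limitation can be checked programmatically: BF16 tensor support requires an Ampere-class GPU (compute capability 8.0+), and the 2080 Ti is Turing (7.5). A sketch of a dtype-picking guard; at runtime the capability pair would come from `torch.cuda.get_device_capability()`:

```python
def supports_bf16(major: int, minor: int) -> bool:
    """BF16 needs compute capability >= 8.0 (Ampere+);
    Turing cards like the 2080 Ti report (7, 5)."""
    return (major, minor) >= (8, 0)

def pick_dtype(major: int, minor: int) -> str:
    # Prefer bf16 when the hardware supports it; otherwise fall back
    # to fp16 (which, as noted above, can produce NaNs on some models).
    return "bf16" if supports_bf16(major, minor) else "fp16"

# On a live system: major, minor = torch.cuda.get_device_capability()
```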

rixjsjsbxh commented 8 hours ago

a805673598f147b74e6d7c3bd8fc8702