notBradPitt opened this issue 2 weeks ago
In my test, it required ~17 GB of VRAM with fp16 dtype at 512x512 resolution.
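For reference, here is a rough sketch of what an fp16 run looks like in plain PyTorch. The tiny stand-in model and the shapes are placeholders, not the actual MusePose pipeline, which the ComfyUI node loads for you:

import torch

# Hypothetical stand-in model; fp16 weights take roughly half the VRAM of fp32
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 64, 3, padding=1),
    torch.nn.Conv2d(64, 3, 3, padding=1),
).half().to("cuda")

x = torch.randn(1, 3, 512, 512, dtype=torch.float16, device="cuda")
with torch.no_grad():
    y = model(x)

# Peak VRAM allocated by PyTorch tensors so far, in GiB
print(torch.cuda.max_memory_allocated() / 1024**3, "GiB")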
If it fails even at 9x16, then something else is wrong. I suspect you're hitting a different error, not a CUDA out-of-memory error.
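One way to tell the two apart, assuming PyTorch 1.13+ where torch.cuda.OutOfMemoryError exists (the oversized allocation below is deliberate and purely illustrative):

import torch

try:
    # Force an out-of-memory condition: 2**40 float32 elements is ~4 TiB
    x = torch.empty((1 << 40,), device="cuda")
except torch.cuda.OutOfMemoryError as err:
    print("Genuine CUDA OOM:", err)
except RuntimeError as err:
    # Any other CUDA or runtime failure lands here instead
    print("Different error:", err)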
> Why does it use more RAM than it does VRAM?
You should check whether you are able to use CUDA:
import torch
# True means PyTorch can see a usable CUDA device; False means it falls back to the CPU and system RAM
print(torch.cuda.is_available())
How much RAM do I need to run this node? I have 16GB of RAM and 16GB of VRAM and still can't get a result.
RAM usage reaches 96% before ComfyUI crashes at the last stage (muse_pose). Even with height and width set to 9x16 (literally nine and sixteen), it still doesn't work. Why does it use more RAM than it does VRAM? Did I set it up incorrectly?
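In case it helps with reproducing this, here is a rough way to watch system RAM alongside VRAM around a run. This is a sketch only; it assumes psutil is installed and is not part of the node itself:

import psutil
import torch

def report_memory(tag: str) -> None:
    # System RAM in use as a percentage of total (the number that hits 96% before the crash)
    ram_pct = psutil.virtual_memory().percent
    # Peak VRAM allocated by PyTorch tensors so far, in GiB (0 if CUDA is unavailable)
    vram_gib = torch.cuda.max_memory_allocated() / 1024**3 if torch.cuda.is_available() else 0.0
    print(f"[{tag}] RAM {ram_pct:.0f}% | peak VRAM {vram_gib:.1f} GiB")

report_memory("before run")
# ... run the workflow here ...
report_memory("after run")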