XLabs-AI / x-flux-comfyui

Apache License 2.0
1.1k stars 72 forks

[bug] Out of memory error reported when using xlabs sampler #128

Open th10010 opened 1 month ago

th10010 commented 1 month ago

My video memory is 8GB, and I use ComfyUI's FLUX to generate images. When I use the xlabs sampler, an out of memory error pops up immediately before the next step can proceed, interrupting image generation. What is the problem? Is my computer configuration too low? If so, what are the minimum configuration requirements? Thank you for your response.

MrBabai commented 1 month ago

The Xlabs sampler and other Xlabs nodes do not support offloading to RAM, so you need enough GPU memory to load the model, LoRAs, etc. completely. You either need a card with a lot of GPU memory, like a 24GB one, or you can try quantized models that will fit on smaller cards. Or maybe someone will implement offloading like the usual ComfyUI nodes that can work with RAM.
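To see why 8GB is not enough without offloading, here is a rough back-of-the-envelope sketch (my own estimate, not from the repo): the FLUX.1 transformer has roughly 12B parameters, so its weights alone at a given precision take approximately `params × bytes_per_param`, before counting the text encoders, VAE, LoRAs, and activations.

```python
def model_vram_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed just for the model weights, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# FLUX.1 transformer: ~12B parameters (approximate figure, used for illustration)
fp16 = model_vram_gib(12, 2.0)   # fp16/bf16: 2 bytes per parameter -> ~22 GiB
q8   = model_vram_gib(12, 1.0)   # 8-bit quantized -> ~11 GiB
q4   = model_vram_gib(12, 0.5)   # 4-bit quantized -> ~5.6 GiB

print(f"fp16 ~{fp16:.1f} GiB, q8 ~{q8:.1f} GiB, q4 ~{q4:.1f} GiB")
```

So even before overhead, full-precision weights exceed an 8GB card by a wide margin, which matches the advice above: use a 24GB card or a quantized checkpoint.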

Edisson75AiResources commented 1 month ago

> The Xlabs sampler and other Xlabs nodes do not support offloading to RAM, so you need enough GPU memory to load the model, LoRAs, etc. completely. You either need a card with a lot of GPU memory, like a 24GB one, or you can try quantized models that will fit on smaller cards. Or maybe someone will implement offloading like the usual ComfyUI nodes that can work with RAM.

Thanks a lot for your answer MrBabai. I was totally lost with the same error until your comment.