th10010 opened this issue 1 month ago
The Xlabs sampler and other Xlabs nodes do not support offloading to RAM, so the model, LoRAs, etc. must fit entirely in GPU memory. You either need a card with a lot of VRAM (a 24GB one, for example) or quantized models that fit on smaller cards. Or maybe someone will implement offloading like the usual Comfy nodes that can work with RAM.
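As a rough sanity check of why 8GB cards run out of memory without offloading, here is a back-of-the-envelope sketch. The ~12B parameter count for the FLUX.1 transformer and the per-precision byte costs are approximations for the weights alone; text encoders, the VAE, and activations add several GB on top:

```python
# Approximate VRAM footprint of FLUX's ~12B-parameter transformer
# weights at various precisions. Weights only -- text encoders,
# VAE, and activations need additional memory.
PARAMS = 12e9  # FLUX.1 transformer parameter count (approximate)

def weight_gb(bits_per_param: float) -> float:
    """Weight footprint in GiB at the given precision."""
    return PARAMS * bits_per_param / 8 / 1024**3

for name, bits in [("fp16/bf16", 16), ("fp8", 8), ("Q4 (4-bit)", 4)]:
    print(f"{name:>10}: ~{weight_gb(bits):.1f} GiB")
```

At fp16 the weights alone are roughly 22 GiB, which is why full-precision FLUX needs a 24GB card, while a 4-bit quantization brings the weights down to around 6 GiB and has a chance of fitting on an 8GB card.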
Thanks a lot
Thanks a lot for your answer, MrBabai. I was totally lost with the same error until your comment.
My video memory is 8GB, and I use ComfyUI's FLUX to generate images. When I use the Xlabs sampler, an out-of-memory error pops up immediately before the next step, interrupting image generation. What is the problem? Is my computer's configuration too low? If so, what are the minimum configuration requirements? Thank you for your response.