DenisKochetov opened 1 month ago
Can't fit into 24 GB VRAM. This is SD 1.5, why is it so big?
Enabling xformers may solve your problem (https://github.com/OpenTexture/Paint3D/blob/main/controlnet/diffusers_cnet_txt2img.py#L23)
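For anyone following along: in diffusers this is a one-line call on the pipeline. A minimal sketch, assuming a stock diffusers pipeline; the `try_enable_xformers` wrapper name and fallback behaviour are my own illustration, not part of Paint3D:

```python
def try_enable_xformers(pipe):
    """Try to switch a diffusers pipeline to xformers memory-efficient
    attention. Returns True on success, False (instead of raising) when
    xformers is missing or incompatible with the installed torch."""
    try:
        pipe.enable_xformers_memory_efficient_attention()
        return True
    except Exception:
        return False
```

With a pipeline from `StableDiffusionPipeline.from_pretrained(...)`, call `try_enable_xformers(pipe)` before inference. If it returns False, `pipe.enable_attention_slicing()` is a slower built-in fallback that reduces attention memory without any extra packages.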
Same problem here. Installing and enabling xformers leads to problems with kaolin because the torch version gets updated to 2.3, and I can't find an xformers version that supports PyTorch 1.12.
But setting the environment variable PYTORCH_CUDA_ALLOC_CONF to max_split_size_mb:512 (https://stackoverflow.com/questions/73747731/runtimeerror-cuda-out-of-memory-how-can-i-set-max-split-size-mb) did the job for me. I am now able to run both demo objects on a 24 GB GPU.
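To spell that out: the variable has to be set before PyTorch initializes the CUDA caching allocator, i.e. before the first tensor reaches the GPU. A minimal sketch, either as a shell export or in Python at the very top of the script (setting it via `os.environ` this early is the only assumption here):

```python
import os

# Must run before torch allocates anything on the GPU; setting it after
# the first CUDA allocation has no effect on the caching allocator.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:512"
```

Equivalently, from the shell before launching: `export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512`. Smaller values make the allocator split large cached blocks, which reduces fragmentation at some speed cost.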