justinpinkney / stable-diffusion


RuntimeError: CUDA out of memory on 12GB card #61

Closed ostap667inbox closed 1 year ago

ostap667inbox commented 1 year ago

I'm trying Image Mixer (gradio_image_mixer.py). After clicking 'Generate' I get this error:

RuntimeError: CUDA out of memory. Tried to allocate 4.88 GiB (GPU 0; 12.00 GiB total capacity; 7.48 GiB already allocated; 1.14 GiB free; 7.83 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Any idea how to make this work on an RTX 3060 12GB?
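(For anyone hitting the same error: the allocator setting mentioned in the message is worth trying first, though it only helps when fragmentation is the problem, not when the model genuinely needs more than 12GB. A minimal sketch below; the 128 MB value is just an illustrative choice, not a recommendation from this repo.)

```python
import os

# Must be set before torch initializes CUDA. Capping the size of cached
# allocator blocks makes large allocations less likely to fail due to
# fragmentation. The 128 MB value here is only an example.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var so the setting takes effect
```

The same thing can be done from the shell when launching gradio_image_mixer.py, e.g. by exporting PYTORCH_CUDA_ALLOC_CONF before running the script.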

justinpinkney commented 1 year ago

Yeah, you need more VRAM. Take a look at the Diffusers library for better low-memory support.
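(The Image Mixer checkpoint itself is served by this repo's custom CompVis-style code, not Diffusers, but the kind of low-memory support being referred to looks roughly like the sketch below for a plain Stable Diffusion pipeline. The model id and options are illustrative, not specific to Image Mixer.)

```python
import torch
from diffusers import StableDiffusionPipeline

# Illustrative Diffusers memory-saving setup on a standard SD checkpoint.
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",  # example model id
    torch_dtype=torch.float16,        # half precision halves weight memory
)
pipe = pipe.to("cuda")

# Compute attention in slices instead of one large matmul, trading a bit
# of speed for a much smaller peak memory footprint.
pipe.enable_attention_slicing()

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")
```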