GLIGEN: Open-Set Grounded Text-to-Image Generation

CUDA out of memory #55

Open · bawat opened this issue 9 months ago

bawat commented 9 months ago

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 11.00 GiB total capacity; 10.15 GiB already allocated; 0 bytes free; 10.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
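For reference, the allocator option the error message itself suggests can be set through the `PYTORCH_CUDA_ALLOC_CONF` environment variable before CUDA is initialised. A minimal sketch of doing that from Python (the 128 MiB split size is just an arbitrary starting value, and this only mitigates fragmentation; it cannot free VRAM the model genuinely needs):

```python
# Hypothetical launcher snippet: applies the allocator hint from the error message.
# The env var must be set before CUDA is initialised (easiest: before importing torch).
# "max_split_size_mb:128" is an arbitrary example value, not a GLIGEN recommendation.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # allocator now uses the smaller split size
```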

From looking at this issue, it seems the model needs ~16 GB of VRAM to run; my GTX 1080 Ti only has 11 GB, with very little else consuming it.

Sun Oct  8 13:38:46 2023
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.50                 Driver Version: 531.79        CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
|   0  NVIDIA GeForce GTX 1080 Ti    On   | 00000000:01:00.0  On |                  N/A |
| 19%   58C    P0             65W / 280W  |   644MiB / 11264MiB  |      0%      Default |

As suggested when googling the issue, I have tried reducing the batch size with `python gligen_inference.py --batch_size 1`, but I am still running out of memory. Does anyone have any ideas, or is what I am attempting simply not possible?
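In case it helps others hitting this, below is a generic PyTorch pattern for trimming inference memory (fp16 weights plus `torch.inference_mode()`). The `model` here is only a stand-in, not GLIGEN's actual network, and whether `gligen_inference.py` already applies these steps is not confirmed:

```python
# Generic VRAM-saving pattern for inference; the model below is a placeholder,
# not the GLIGEN pipeline. Assumes a CUDA device is available.
import torch

device = torch.device("cuda")

model = torch.nn.Sequential(          # stand-in for the real diffusion model
    torch.nn.Linear(768, 768),
    torch.nn.GELU(),
    torch.nn.Linear(768, 768),
).to(device).eval().half()            # fp16 weights roughly halve weight memory

x = torch.randn(1, 77, 768, device=device, dtype=torch.float16)

with torch.inference_mode():          # no autograd graph -> far fewer live activations
    out = model(x)

print(out.shape, f"{torch.cuda.max_memory_allocated() / 2**20:.0f} MiB peak")
```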

CREED404 commented 7 months ago

Same problem here. Do you get the same error when using the GLIGEN demo, which uses xformers?
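For context on why the xformers-based demo may fit where the plain script does not: xformers' memory-efficient attention avoids materialising the full attention score matrix. A rough sketch of the call (tensor shapes are made up for illustration; this is not GLIGEN's actual attention code):

```python
# Illustrative only: xformers' memory-efficient attention kernel.
# Tensor shapes are arbitrary, not GLIGEN's.
import torch
import xformers.ops as xops

B, L, H, D = 1, 4096, 8, 64                      # batch, tokens, heads, head dim
q = torch.randn(B, L, H, D, device="cuda", dtype=torch.float16)
k = torch.randn(B, L, H, D, device="cuda", dtype=torch.float16)
v = torch.randn(B, L, H, D, device="cuda", dtype=torch.float16)

# Naive attention materialises a (B, H, L, L) score matrix -- the usual OOM culprit.
# The memory-efficient kernel computes the same result in tiles instead:
out = xops.memory_efficient_attention(q, k, v)   # shape (B, L, H, D)
print(out.shape)
```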