Whenever I try to run a 4K image, I get this error:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 18.00 GiB (GPU 0; 24.00 GiB total capacity; 4.58 GiB already allocated; 142.00 MiB free; 22.57 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I have a 3090 with 24 GB of VRAM. Can you please suggest a solution for this?

---

Thanks for your interest in our work. This is odd to me: our model can be hosted on Colab with a GPU that has less memory than a 3090. Could you share the launch script so I can reproduce this? I can test it on my side.

---

Closing due to inactivity. Feel free to reopen it!

Closed by ghost 11 months ago.
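---

For anyone hitting this on a 24 GB card: two general workarounds, independent of this repo, are to set the allocator option the error message points to and to process the 4K image in overlapping tiles under half-precision autocast, so no single forward pass needs the full 18 GiB of activations. The sketch below assumes a generic image-to-image PyTorch model; `run_tiled`, the tile size, the overlap, and the 128 MB split value are illustrative choices, not names or settings from this codebase.

```python
import os

# Mitigate allocator fragmentation, as the error message suggests.
# Must be set before the first CUDA allocation (safest: before
# importing torch). 128 MB is an arbitrary starting value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch


@torch.no_grad()  # inference only: skip autograd bookkeeping
def run_tiled(model, image, tile=1024, overlap=32):
    """Run `model` over a large CHW image tensor in tiles.

    Assumes `model` is already on the GPU, in eval mode, and returns
    an output the same size as its input. Overlapping regions are
    simply overwritten by later tiles; seam blending is omitted to
    keep the sketch short.
    """
    c, h, w = image.shape
    out = torch.zeros_like(image)
    step = tile - overlap
    for top in range(0, h, step):
        for left in range(0, w, step):
            bottom = min(top + tile, h)
            right = min(left + tile, w)
            patch = image[:, top:bottom, left:right].unsqueeze(0).cuda()
            # Half-precision autocast roughly halves activation memory.
            with torch.autocast(device_type="cuda", dtype=torch.float16):
                result = model(patch)
            out[:, top:bottom, left:right] = result.squeeze(0).float().cpu()
            del patch, result
    torch.cuda.empty_cache()  # return cached blocks to the driver
    return out
```

Shrinking `tile` trades speed for peak memory, so if a 1024-pixel tile still OOMs, try 512. Whether tiling is acceptable depends on the model: architectures with global receptive fields may show seams at tile boundaries.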