FrozenBurning / Text2Light

[SIGGRAPH Asia 2022] Text2Light: Zero-Shot Text-Driven HDR Panorama Generation
https://frozenburning.github.io/projects/text2light/

GPU memory issue #20

Closed ghost closed 11 months ago

ghost commented 1 year ago

Whenever I try to generate a 4K image, I get this error:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 18.00 GiB (GPU 0; 24.00 GiB total capacity; 4.58 GiB already allocated; 142.00 MiB free; 22.57 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I have a 3090 with 24 GB of VRAM. Can you please suggest a solution for this?
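
For reference, the allocator hint from the error message can be applied by setting PYTORCH_CUDA_ALLOC_CONF before the script initializes CUDA. This is a minimal sketch, not verified against this repo, and the 128 MB split size is only an illustrative value:

```python
import os

# Hint from the error message: let PyTorch's caching allocator split large
# blocks to reduce fragmentation-related OOMs. This must be set before CUDA
# is initialized, so it goes at the very top of the launch script.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"  # example value

import torch

if torch.cuda.is_available():
    # The allocator config is picked up on the first CUDA initialization.
    print(torch.cuda.get_device_name(0))
```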

FrozenBurning commented 1 year ago

Thanks for your interest in our work. This is strange to me. Our model can be hosted on Colab with a GPU that has less memory than a 3090. Could you share the launch script so I can reproduce this? I can test on my side.

FrozenBurning commented 11 months ago

Closing due to inactivity. Feel free to reopen it!