shermanlian / spatial-entropy-loss

Equipping Diffusion Models with Differentiable Spatial Entropy for Low-Light Image Enhancement, CVPRW 2024. Best LPIPS in the NTIRE challenge.
https://arxiv.org/abs/2404.09735
MIT License

torch.cuda.OutOfMemoryError: CUDA out of memory #3

Open · liujun0621 opened this issue 1 month ago

liujun0621 commented 1 month ago

When I train on a 2080 or 4090 GPU, training fails with the following error:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 3.00 GiB (GPU 0; 10.75 GiB total capacity; 5.97 GiB already allocated; 2.42 GiB free; 8.08 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
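The message suggests setting `max_split_size_mb` via PyTorch's `PYTORCH_CUDA_ALLOC_CONF` environment variable. A minimal sketch of doing so before training starts (the 128 MiB split size here is just an illustrative value, not a tested recommendation):

```python
import os

# Must be set before the first CUDA allocation; the caching allocator
# reads it at initialization. max_split_size_mb caps the size of cached
# blocks the allocator will split, which can reduce fragmentation when
# reserved memory is much larger than allocated memory.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var so the allocator picks it up
```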


Thanks very much!

Algolzw commented 3 weeks ago

Hi! The proposed loss indeed requires a large amount of memory (we trained on an A100 GPU with 40GB). In practice, you can use a smaller patch size or reduce pixel_level to a smaller value (16 or 8) for computational efficiency.
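To illustrate why pixel_level drives memory use, here is a minimal soft-histogram entropy sketch in PyTorch. This is not the repository's actual loss; `soft_entropy`, `pixel_level`, and `bandwidth` are illustrative names and assumptions. The key point: a soft histogram materializes a (batch, pixel_level, num_pixels) assignment tensor, so halving pixel_level, or cropping smaller patches, shrinks that tensor proportionally.

```python
import torch

def soft_entropy(x, pixel_level=16, bandwidth=0.05):
    """Differentiable entropy of image intensities via a soft histogram.

    x: tensor of shape (B, C, H, W) with values in [0, 1].
    pixel_level: number of histogram bins; memory scales linearly with it,
    because a (B, pixel_level, C*H*W) soft-assignment tensor is built.
    """
    b = x.shape[0]
    flat = x.reshape(b, 1, -1)                                # (B, 1, N)
    centers = torch.linspace(0.0, 1.0, pixel_level,
                             device=x.device).view(1, -1, 1)  # (1, L, 1)
    # Gaussian kernel soft-assigns every pixel to every bin; this
    # (B, L, N) tensor is what dominates the memory footprint.
    weights = torch.exp(-0.5 * ((flat - centers) / bandwidth) ** 2)
    hist = weights.sum(dim=-1)                                # (B, L)
    p = hist / (hist.sum(dim=-1, keepdim=True) + 1e-12)       # normalize
    return -(p * torch.log(p + 1e-12)).sum(dim=-1)            # (B,)

# Halving pixel_level (e.g. 32 -> 16) halves the soft-assignment tensor,
# and smaller training patches shrink N = C*H*W the same way.
x = torch.rand(2, 3, 128, 128, requires_grad=True)
loss = soft_entropy(x, pixel_level=16).mean()
loss.backward()
```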