rinongal / textual_inversion


CUDA ran out of memory while running main.py #155

Open PremHCLTech opened 1 year ago

PremHCLTech commented 1 year ago

I am getting a CUDA out-of-memory error. I even tried decreasing the batch size, but that didn't work. Any help would be appreciated.

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 150.00 MiB (GPU 0; 15.90 GiB total capacity; 3.58 GiB already allocated; 120.25 MiB free; 3.67 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
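Note that the traceback reports only about 3.7 GiB reserved by PyTorch on a ~16 GiB card with 120 MiB free, which suggests other processes may already be occupying most of the GPU. The message itself points at the `max_split_size_mb` allocator option via `PYTORCH_CUDA_ALLOC_CONF`. Below is a minimal sketch (not from this repo's maintainers) of applying that hint and checking free memory before launching training; the value 128 is an arbitrary example, not a tuned recommendation.

```python
# Sketch: apply the allocator hint from the error message and inspect GPU memory.
# PYTORCH_CUDA_ALLOC_CONF must be set before CUDA is first initialized.
import os
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

import torch

if torch.cuda.is_available():
    # Free/total device memory in bytes, as reported by the CUDA driver.
    free_bytes, total_bytes = torch.cuda.mem_get_info()
    print(f"GPU free: {free_bytes / 1024**3:.2f} GiB of {total_bytes / 1024**3:.2f} GiB")
    # Release cached-but-unused blocks held by PyTorch's caching allocator.
    torch.cuda.empty_cache()
```

Equivalently, the variable can be exported in the shell (e.g. `export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128`) before running main.py. If another process is holding the memory, `nvidia-smi` will show it, and freeing that is likely more effective than changing the allocator setting.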