zouberou-sayibou opened 10 months ago
I have a similar problem. My GPU is a GeForce 4050 with a total capacity of 6 GiB; I am not sure if our problems are the same type.
OutOfMemoryError: CUDA out of memory. Tried to allocate 1.36 GiB. GPU 0 has a total capacty of 5.77 GiB of which 284.31 MiB is free. Including non-PyTorch memory, this process has 5.48 GiB memory in use. Of the allocated memory 5.34 GiB is allocated by PyTorch, and 22.94 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Maybe I should somehow tell seamless to do it more slowly?
Same here, but with the medium model; if I use it right after turning on the PC, it works.
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 318.00 MiB. GPU 0 has a total capacty of 6.00 GiB of which 0 bytes is free. Including non-PyTorch memory, this process has 17179869184.00 GiB memory in use. Of the allocated memory 5.00 GiB is allocated by PyTorch, and 271.88 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
I got this error too, and the amount of GPU memory the code requires is ridiculous. Is anyone else having the same problem?
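For what it's worth, the error message in both logs points at `max_split_size_mb`, which is passed through the `PYTORCH_CUDA_ALLOC_CONF` environment variable. A minimal sketch of setting it from Python, assuming the standard PyTorch caching-allocator variable (the value `128` is illustrative, not a recommendation from this thread; the variable must be set before the first CUDA allocation, so either export it in the shell or set it before `import torch`):

```python
import os

# Must be in place before torch makes its first CUDA allocation,
# so set it before `import torch` (or export it in your shell).
# max_split_size_mb:128 is an example value; smaller values reduce
# fragmentation at some performance cost.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

def parse_alloc_conf(value: str) -> dict:
    """Parse a PYTORCH_CUDA_ALLOC_CONF string into key/value pairs.

    Helper for sanity-checking what was set; not part of PyTorch itself.
    """
    pairs = (item.split(":", 1) for item in value.split(",") if item)
    return {key: val for key, val in pairs}

print(parse_alloc_conf(os.environ["PYTORCH_CUDA_ALLOC_CONF"]))
# → {'max_split_size_mb': '128'}
```

Note that this only helps when "reserved but unallocated" memory is large (fragmentation); if the model genuinely needs more than 6 GiB, a smaller model variant or CPU offloading is the only real fix.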