Closed M-Abdallah closed 4 years ago
same thing: RuntimeError: CUDA out of memory. Tried to allocate 16.00 MiB (GPU 0; 5.77 GiB total capacity; 4.51 GiB already allocated; 15.75 MiB free; 84.29 MiB cached)
It'll be helpful to reduce BATCH_SIZE, and I'd also recommend killing any other processes that are using the GPU.
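The "reduce BATCH_SIZE" advice can be automated: halve the batch size until one step no longer raises an OOM. A minimal sketch of that pattern, where `train_step` is a hypothetical function you would supply (the fake version here just simulates an OOM above 8 samples for illustration):

```python
def find_max_batch_size(train_step, start=64):
    """Halve the batch size until one training step fits in GPU memory."""
    bs = start
    while bs >= 1:
        try:
            train_step(bs)
            return bs
        except RuntimeError as e:
            # Only swallow CUDA OOM errors; re-raise anything else.
            if "out of memory" not in str(e):
                raise
            bs //= 2
    raise RuntimeError("even batch size 1 does not fit")

# Toy stand-in (assumption, not a real model): pretend anything
# above 8 samples overflows GPU memory.
def fake_step(bs):
    if bs > 8:
        raise RuntimeError("CUDA out of memory")

print(find_max_batch_size(fake_step, start=64))  # -> 8
```

With a real model you'd pass a closure that runs one forward/backward pass at the given batch size, and call `torch.cuda.empty_cache()` between attempts so cached blocks from the failed try are released.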
@ryujaehun have you tried batch size 1? I see no speed up using fp16 vs fp32.
https://pytorch.org/docs/stable/bottleneck.html will help you
It clearly shows I still have over 1 GB unallocated.
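Worth noting: the "missing" gigabyte in the error above isn't actually free for PyTorch. Summing the numbers from the message shows roughly 1.16 GiB that is neither allocated, free, nor cached by PyTorch's allocator; that slice is typically held by the CUDA context, the display driver, or other processes. A quick check of that arithmetic (parsing the exact error string from this thread):

```python
import re

# The OOM message reported earlier in this thread.
msg = ("CUDA out of memory. Tried to allocate 16.00 MiB "
       "(GPU 0; 5.77 GiB total capacity; 4.51 GiB already allocated; "
       "15.75 MiB free; 84.29 MiB cached)")

def to_gib(value, unit):
    """Normalize MiB values to GiB."""
    return value / 1024 if unit == "MiB" else value

parts = {}
for val, unit, label in re.findall(
        r"([\d.]+) (GiB|MiB) (total capacity|already allocated|free|cached)",
        msg):
    parts[label] = to_gib(float(val), unit)

# Memory the allocator cannot see: total minus everything PyTorch accounts for.
unaccounted = (parts["total capacity"] - parts["already allocated"]
               - parts["free"] - parts["cached"])
print(f"{unaccounted:.2f} GiB")  # -> 1.16 GiB
```

So the GPU isn't lying: from PyTorch's point of view only ~16 MiB is actually free, and the rest of the gap belongs to other consumers of the card. `nvidia-smi` will show who holds it.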