Thank you for the code, it has helped me a lot.
Why does CUDA still run out of memory when I set batch_size to 1, num_workers=0, and gpus=[0,1,2,3] (4 GPUs)?
I have tried it many times, even with 6 GPUs, and I have set multi_gpu_mode to ddp_spawn/ddp/dp.
If you have any advice, please let me know. Thank you very much.
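For reference, here is a minimal sketch of the settings I described, assuming multi_gpu_mode maps onto a PyTorch Lightning Trainer (the exact argument names depend on the Lightning version, and the dummy dataset is just a placeholder):

```python
# Minimal sketch of the configuration described above.
# Assumptions: a recent PyTorch Lightning version where Trainer accepts
# `gpus` and `strategy`; the random TensorDataset is only a placeholder.
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

# batch_size=1 and num_workers=0, as mentioned above
dataset = TensorDataset(torch.randn(8, 3, 224, 224))
loader = DataLoader(dataset, batch_size=1, num_workers=0)

trainer = pl.Trainer(
    gpus=[0, 1, 2, 3],   # the 4 GPUs I am using
    strategy="ddp",      # also tried "ddp_spawn" and "dp" via multi_gpu_mode
)
```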