chenjoya opened this issue 6 years ago
@ChenJoya Hi,
I want to ask: in Darknet, do I need to set batch size = 8 for 4 GPUs to keep a batch size of 32?
Yes, you need to set batch=8 for 4 GPUs to keep a total batch size of 32.
Because if you set batch=64 subdivisions=32 and train using 4 GPUs, then the total batch_size will be 256 = 64*4 = batch * ngpus.
Or, equivalently, mini_batch * subdivisions * ngpus = 2*32*4 = 256, where mini_batch = batch / subdivisions = 64/32 = 2.
See: https://github.com/AlexeyAB/darknet/blob/17019854c33b60a76952494091726868e622fb2b/src/detector.c#L113
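A minimal sketch of this arithmetic in plain C (not darknet's actual code; the variable names just mirror the .cfg parameters, and the "each GPU processes `batch` images" rule is as described above):

```c
#include <stdio.h>

/* Sketch of the multi-GPU batch-size arithmetic described above:
   each GPU processes `batch` images per iteration, so the
   effective batch is batch * ngpus. */
int main(void) {
    int batch = 64, subdivisions = 32, ngpus = 4;  /* example values from above */
    int mini_batch = batch / subdivisions;         /* images per forward pass on one GPU: 64/32 = 2 */

    int total = batch * ngpus;                     /* 64 * 4 = 256 */
    printf("total batch = %d = batch * ngpus\n", total);
    printf("          or %d = mini_batch * subdivisions * ngpus\n",
           mini_batch * subdivisions * ngpus);

    /* To keep an effective batch of 32 when training on 4 GPUs,
       set batch = 32 / ngpus = 8 in the .cfg file. */
    int target = 32;
    printf("cfg batch for effective %d on %d GPUs: %d\n",
           target, ngpus, target / ngpus);
    return 0;
}
```

Running it prints 256 for both forms of the total and 8 for the 4-GPU case, matching the numbers above.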
I have been looking for an answer to this question for a long time, and I finally found it here. Thank you for your work!
Hi AlexeyAB. My question is about batch size with multiple GPUs. In other deep learning frameworks, it seems that if your batch size = 32 and you train on 4 GPUs, then each GPU only gets 32 / 4 = 8 images before doing backprop. I want to ask: in Darknet, do I need to set batch size = 8 for 4 GPUs to keep a batch size of 32? Thank you.