Closed · CSU-NXY closed this 1 year ago
If you decrease the batch size, you should increase the number of iterations and decrease the learning rate according to the linear scaling rule.
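A minimal sketch of the linear scaling rule (the base values below are hypothetical placeholders, not the actual MipNeRF360 defaults):

```python
# Hypothetical base hyperparameters -- substitute the values from your config.
base_batch_size = 4096
base_lr = 2e-3
base_iters = 250_000

def scaled_hparams(new_batch_size):
    """Apply the linear scaling rule: learning rate scales linearly with
    batch size, while the iteration count scales inversely so that the
    total number of training samples seen stays roughly constant."""
    scale = new_batch_size / base_batch_size
    lr = base_lr * scale             # smaller batch -> proportionally smaller LR
    iters = int(base_iters / scale)  # smaller batch -> proportionally more iterations
    return lr, iters

# Example: quartering the batch size quarters the LR and quadruples the iterations.
lr, iters = scaled_hparams(1024)
```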
@CSU-NXY I have the same question as you. Did you figure out the configuration for multiple GPUs?
Hi, I'm trying to train MipNeRF360 with 4 GPUs. Should I leave the batch_size unchanged or multiply it by 4? Are there any other changes I should make to support multi-GPU training?
Thanks!