Qjizhi closed this issue 2 years ago
My device is a 3080 with 16 GB, but with the original code I can only train with batch_size=1; otherwise I get CUDA out of memory.
I tried mixed precision and could increase batch_size to 2. How can I train with a larger batch size without dropping accuracy?
As mentioned in the email, we used multiple GPUs to enable larger batch sizes.
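If multiple GPUs are not available, gradient accumulation is a common way to simulate a larger effective batch size on a single GPU: run several small micro-batches, accumulate their gradients, and take one optimizer step. Below is a minimal PyTorch sketch; the model, sizes, and variable names are illustrative placeholders, not taken from this repo's code.

```python
# Hedged sketch: gradient accumulation on a single device.
# All names (model, accum_steps, micro_batch, ...) are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Linear(8, 1)                      # stand-in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

accum_steps = 4                              # micro-batches per optimizer step
micro_batch = 2                              # what fits in memory (e.g. batch_size=2)
# effective batch size = accum_steps * micro_batch = 8

data = [(torch.randn(micro_batch, 8), torch.randn(micro_batch, 1))
        for _ in range(accum_steps)]

w0 = model.weight.detach().clone()           # snapshot weights before training

optimizer.zero_grad()
for step, (x, y) in enumerate(data, start=1):
    # Divide by accum_steps so the summed gradients average over
    # the effective batch, matching a true large-batch step.
    loss = loss_fn(model(x), y) / accum_steps
    loss.backward()                          # gradients accumulate in .grad
    if step % accum_steps == 0:
        optimizer.step()                     # one update per effective batch
        optimizer.zero_grad()
```

This combines cleanly with mixed precision (wrap the forward pass in `torch.autocast` and scale the loss with `torch.cuda.amp.GradScaler`, calling `scaler.step`/`scaler.update` only on the accumulation boundary). Note that batch-norm statistics are still computed per micro-batch, so very small micro-batches can still hurt accuracy for batch-norm-heavy models.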