Closed lvxuanxuan123 closed 2 years ago
samples_per_gpu
The `samples_per_gpu` setting is the batch size per GPU. When adjusting the batch size, please adjust the learning rate accordingly.
https://github.com/zhiqi-li/BEVFormer/blob/846fb5bcf0ff746f4f50dfa436c7ccbf7e3122bd/projects/configs/bevformer_fp16/bevformer_tiny_fp16.py#L211
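A common convention for this (not stated in the thread, so treat it as an assumption) is the linear scaling rule: scale the base learning rate in proportion to the effective batch size (`samples_per_gpu` × number of GPUs). A minimal sketch, with illustrative numbers rather than the actual config values:

```python
def scaled_lr(base_lr, base_batch_size, samples_per_gpu, num_gpus):
    """Scale a learning rate linearly with the effective batch size.

    Assumes the linear scaling rule; all argument values below are
    illustrative, not taken from the BEVFormer config.
    """
    effective_batch_size = samples_per_gpu * num_gpus
    return base_lr * effective_batch_size / base_batch_size

# Example: a config tuned for 8 GPUs x 1 sample; doubling
# samples_per_gpu to 2 doubles the effective batch size, so the
# learning rate doubles as well.
print(scaled_lr(2.8e-4, 8, 2, 8))  # -> 0.00056
```

Whether linear scaling is the right rule for a given model is itself a judgment call; some setups prefer square-root scaling or a warmup phase instead.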
Thanks for Dr. Li's work!
The issue has been solved.