PeidongLi / DualBEV

[ECCV 2024] The official implementation of DualBEV

Poor Performance with 'dualbev-4d-r50-cbgs' #4

Open Mingqj opened 3 weeks ago

Mingqj commented 3 weeks ago

Great work! I have re-implemented dualbev-4d-r50-cbgs on 4 RTX 3090 GPUs, but I get poor performance. The only change in the config is that I reduced the original learning rate by half, since my total batch size is 8 * 4 = 32. I also observe that the model converges by around the 15th epoch. Could you give me some suggestions? Here are my re-implementation results:

epoch 12: NDS: 48.90, mAP: 37.31
epoch 13: NDS: 49.14, mAP: 37.33
epoch 14: NDS: 49.37, mAP: 37.63
epoch 15: NDS: 49.67, mAP: 37.71
epoch 16: NDS: 49.54, mAP: 37.96
epoch 17: NDS: 49.61, mAP: 37.81
epoch 18: NDS: 49.66, mAP: 37.83
epoch 19: NDS: 49.54, mAP: 37.80
epoch 20: NDS: 49.45, mAP: 37.81
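
For context, the learning rate is the only edit I made on top of the released config. A minimal sketch of the override, assuming DualBEV follows the standard mmcv/BEVDet-style config layout (the base-config path and field names here are illustrative, not the exact files in the repo):

```python
# my_dualbev-4d-r50-cbgs_4gpu.py -- hypothetical override config
_base_ = ['./dualbev-4d-r50-cbgs.py']  # assumed name of the released config

# 4 GPUs x 8 samples/GPU = total batch 32, so the base lr (2e-4) is halved
# following the linear scaling rule.
data = dict(samples_per_gpu=8)
optimizer = dict(lr=1e-4)  # partial dict override; other optimizer fields are inherited
```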

PeidongLi commented 1 week ago

Sorry, we have not tested with 4 GPUs. Can you reproduce the result with our checkpoint?

Mingqj commented 1 week ago

> Sorry, we have not tested with 4 GPUs. Can you reproduce the result with our checkpoint?

Thanks for your reply! I can reproduce the results with the released checkpoint, and I have obtained similar results (NDS: 50.61, mAP: 38.32) when using a larger learning rate (2e-4) with 4 GPUs.
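
For anyone who hits the same issue: under the linear scaling rule the halved LR would be the expected choice for a total batch of 32, but in my runs keeping the base 2e-4 was what matched the released checkpoint. A tiny sketch of the arithmetic, assuming the reference schedule uses a total batch size of 64 (8 GPUs x 8 samples, which is an assumption about the original setup):

```python
# Linear LR scaling rule: lr_new = lr_base * (batch_new / batch_base).
base_lr, base_batch = 2e-4, 8 * 8       # released schedule (assumed: 8 GPUs x 8 samples)
new_batch = 4 * 8                        # 4 x RTX 3090, 8 samples per GPU
scaled_lr = base_lr * new_batch / base_batch
print(f"linearly scaled lr: {scaled_lr:.0e}")  # 1e-04 -- the setting that underperformed
print(f"kept base lr:       {base_lr:.0e}")    # 2e-04 -- the setting that matched the checkpoint
```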