Hi Pan,
Thanks for pointing that out! The bug does not occur if we use the spawn start method, as follows:
from torch.multiprocessing import Pool, Process, set_start_method
try:
    set_start_method('spawn')
except RuntimeError:
    pass
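Note that set_start_method is meant to be called once, early in the entry script (typically under the if __name__ == '__main__': guard) and before any DataLoader workers or other processes are created; the except RuntimeError above simply ignores the case where the start method has already been set.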
For all experiments conducted in the paper, we use the spawn setting. We will fix the bug in the released code soon.
Thanks, Zaiwei
@zaiweizhang Great. Looking forward to your updated code and results. Thanks
So calling main with --multiprocessing-distributed solves the problem in this case, correct?
Actually, you also need to change the start method to spawn. We have changed it in main.py. Please try again! Sorry for the trouble.
Ok, thanks for the information. Does this bug occur across all PyTorch frameworks? For example, does it happen when fine-tuning with OpenPCDet?
@baraujo98 As mentioned, this is a common bug that affects a lot of PyTorch frameworks. A blog post discusses it as well here.
"The bug is easy to make. In some cases, it has minimal effect on final performance. In others, the identical augmentations can cause severe degradations."
@baraujo98 I do not think it will affect fine-tuning with OpenPCDet, but it's better to check their codebase.
Ok, it looks to me like OpenPCDet does not set a fixed seed by default, as hinted by the fact that its --fix_random_seed argument exists and defaults to False. Is that enough to be safe, or should I make sure the seed is being reset after each epoch? Thanks for the heads up @BestSonny and for the fast correction @zaiweizhang
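For what it's worth, a small stand-alone probe can answer that: the ProbeDataset below is purely illustrative (not from OpenPCDet) and prints the random draws seen in each epoch; if the same numbers come back across epochs or between workers, the duplicated-seed bug is present and a worker_init_fn like the one above is needed. This assumes the default fork start method on Linux.

import numpy as np
from torch.utils.data import Dataset, DataLoader

class ProbeDataset(Dataset):
    # Each item returns a fresh NumPy draw, standing in for a random
    # augmentation parameter.
    def __len__(self):
        return 8
    def __getitem__(self, idx):
        return np.random.rand()

if __name__ == '__main__':
    loader = DataLoader(ProbeDataset(), batch_size=4, num_workers=2)
    for epoch in range(2):
        draws = [round(float(x), 4) for batch in loader for x in batch]
        print(f'epoch {epoch}: {draws}')
    # Repeated values across epochs (or between the two workers) indicate
    # that the augmentation seeds are being duplicated.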
Feel free to reopen. Closing it for now.
Dear authors,
I think this repo is an excellent implementation of your excellent paper. However, it shares a common bug in setting the seed for data augmentation, as pointed out here and here.
I have tested this repo and found that each sample ends up with only two fixed augmented versions across all training epochs, which is problematic for the self-contrastive learning setting.
You may want to revisit the experiments in the paper.
Best, Pan He