amazon-science / omni-detr

PyTorch implementation of Omni-DETR for omni-supervised object detection: https://arxiv.org/abs/2203.16089

How many epochs do I need to train in SSOD mode #3

Closed 1224wxwx closed 2 years ago

1224wxwx commented 2 years ago

Hi, thanks for sharing your great project!

If I want to reproduce the SSOD results (in Table 3), could you please tell me how to set the number of epochs? Should it still be 150, as written in r50_ut_detr_omni_coco.sh?

I’m looking forward to hearing from you.

peiwang062 commented 2 years ago

Hi, could you try these settings?

| Split | 1% | 2% | 5% | 10% |
| --- | --- | --- | --- | --- |
| Burn-in epochs | 400 | 100 | 40 | 20 |
| Total epochs | 1000 | 800 | 400 | 200 |
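
For reference, the suggested schedule above could be kept in a small lookup table keyed by the labeled fraction. This is only an illustrative sketch; the names `EPOCH_SCHEDULE` and `epochs_for_split` are not from the repo, and the values would still need to be passed to the epoch arguments in r50_ut_detr_omni_coco.sh by hand:

```python
# Suggested epoch schedule from the table above, keyed by labeled fraction.
# All names here are illustrative; they do not come from the omni-detr repo.
EPOCH_SCHEDULE = {
    0.01: {"burn_in_epochs": 400, "total_epochs": 1000},
    0.02: {"burn_in_epochs": 100, "total_epochs": 800},
    0.05: {"burn_in_epochs": 40, "total_epochs": 400},
    0.10: {"burn_in_epochs": 20, "total_epochs": 200},
}

def epochs_for_split(labeled_fraction: float) -> dict:
    """Return the suggested burn-in and total epochs for a labeled split."""
    try:
        return EPOCH_SCHEDULE[labeled_fraction]
    except KeyError:
        raise ValueError(f"no suggested schedule for split {labeled_fraction:.0%}")

print(epochs_for_split(0.10))  # {'burn_in_epochs': 20, 'total_epochs': 200}
```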
1224wxwx commented 2 years ago

> Hi, could you try these settings?
>
> | Split | 1% | 2% | 5% | 10% |
> | --- | --- | --- | --- | --- |
> | Burn-in epochs | 400 | 100 | 40 | 20 |
> | Total epochs | 1000 | 800 | 400 | 200 |

Thank you for your reply! These epoch counts seem much higher than the original settings in Unbiased Teacher. Does that mean Deformable DETR is much harder to train than anchor-based models?

peiwang062 commented 2 years ago

For Unbiased Teacher, training runs for 180,000 iterations with a batch size of 32 for both labeled and unlabeled data. On the 10% setting, if we compute epochs from the perspective of the labeled samples, that is around (180000 × 32) / (118287 × 0.1) ≈ 487 epochs. From the perspective of the unlabeled samples, it is around (180000 × 32) / (118287 × 0.9) ≈ 54 epochs. Our method with Deformable DETR needs 200 epochs, so it is hard to say which one trains longer.
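
The arithmetic above can be checked with a few lines of Python; the constants are taken from the comment (COCO train2017 has 118,287 images), and the variable names are just illustrative:

```python
# Effective-epoch arithmetic for the Unbiased Teacher 10% COCO setting.
ITERATIONS = 180_000          # training iterations
BATCH_SIZE = 32               # per-iteration batch size (labeled and unlabeled each)
COCO_TRAIN_IMAGES = 118_287   # images in COCO train2017
LABELED_FRACTION = 0.1        # 10% split

labeled_images = COCO_TRAIN_IMAGES * LABELED_FRACTION            # ~11,829
unlabeled_images = COCO_TRAIN_IMAGES * (1 - LABELED_FRACTION)    # ~106,458

epochs_over_labeled = ITERATIONS * BATCH_SIZE / labeled_images      # ~487
epochs_over_unlabeled = ITERATIONS * BATCH_SIZE / unlabeled_images  # ~54

print(f"labeled-view epochs:   {epochs_over_labeled:.0f}")    # 487
print(f"unlabeled-view epochs: {epochs_over_unlabeled:.0f}")  # 54
```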

1224wxwx commented 2 years ago

OK, thanks