amirbar / DETReg

Official implementation of the CVPR 2022 paper "DETReg: Unsupervised Pretraining with Region Priors for Object Detection".
https://amirbar.net/detreg
Apache License 2.0

Question about the lr_drop in DETR-based experiment #41

Closed: 4-0-4-notfound closed this issue 2 years ago

4-0-4-notfound commented 2 years ago

It seems lr_drop is missing from the pretraining and fine-tuning scripts. The default lr_drop for DETR is 200, but pretraining runs for only 60 epochs, so the learning rate would never drop; lr_drop may have been left out by mistake.
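
For reference, here is a minimal sketch of how lr_drop is typically consumed in DETR-style training loops: a StepLR scheduler that multiplies the lr by 0.1 once lr_drop epochs have elapsed. The model and base lr below are placeholders, not DETReg's actual setup; the point is just that with the default lr_drop=200 a 60-epoch run never triggers the drop.

```python
import torch

# Placeholder model and lr, standing in for the detector and its optimizer.
model = torch.nn.Linear(8, 8)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)

epochs, lr_drop = 60, 200  # 60-epoch pretraining vs. DETR's default lr_drop
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=lr_drop)

for epoch in range(epochs):
    # ... train_one_epoch(model, optimizer, ...) would run here ...
    lr_scheduler.step()

# The scheduler's epoch counter never reaches 200, so the lr is still 2e-4:
print(optimizer.param_groups[0]["lr"])
```
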

amirbar commented 2 years ago

I think the defaults should be OK. For DETReg with Deformable DETR pretrained on IN1k I did not try dropping the lr (just 5 epochs of pretraining). Similarly, for DETReg with DETR on ImageNet I didn't drop the lr. On IN100, where I reported some results in the paper, I dropped the lr after 40 epochs and trained for 50 epochs.
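
Under the same StepLR assumption as above, the IN100 schedule would look like the sketch below: 50 epochs total, with the lr reduced by the default factor of 0.1 after 40 epochs. The base lr of 2e-4 is a placeholder, not necessarily the value used in the paper.

```python
import torch

# A single placeholder parameter stands in for the model's weights.
optimizer = torch.optim.AdamW([torch.nn.Parameter(torch.zeros(1))], lr=2e-4)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=40)

for epoch in range(50):
    # ... one training epoch would run here ...
    lr_scheduler.step()
    if epoch in (38, 39):
        # Going into epoch 40 the lr drops from 2e-4 to 2e-5, so the
        # last 10 epochs train at the reduced rate.
        print(epoch, optimizer.param_groups[0]["lr"])
```
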