Closed: 4-0-4-notfound closed this issue 2 years ago
I think the defaults should be OK. For DETReg using Deformable-DETR on IN1k I did not try LR dropping (just 5 epochs of pretraining). Similarly, for DETReg using DETR on ImageNet I didn't drop the LR. On IN100, where I reported some results in the paper, I dropped the LR after 40 epochs and trained for 50 epochs.
It seems `lr_drop` is missing from the pretraining and finetuning scripts. The default `lr_drop` for DETR is 200, but the pretraining runs for only 60 epochs, so `lr_drop` is possibly missing.
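For reference, a minimal sketch of the step schedule being discussed, assuming the DETR-style argparse convention where `--lr_drop` names the epoch at which the learning rate is divided by 10 (the exact flag names and defaults in DETReg's scripts may differ):

```python
import argparse

# Hypothetical arguments mirroring the DETR convention; the IN100 setting
# mentioned above would be --epochs 50 --lr_drop 40.
parser = argparse.ArgumentParser()
parser.add_argument('--lr', default=2e-4, type=float)
parser.add_argument('--epochs', default=50, type=int)
parser.add_argument('--lr_drop', default=40, type=int,
                    help='epoch at which the learning rate is divided by 10')
args = parser.parse_args(['--epochs', '50', '--lr_drop', '40'])

def lr_at(epoch: int) -> float:
    """Step schedule: divide the base LR by 10 once lr_drop is reached."""
    return args.lr * (0.1 if epoch >= args.lr_drop else 1.0)

# With lr_drop=200 and only 60 pretraining epochs, the drop never triggers,
# which is why the flag's absence from the scripts makes little difference.
schedule = [lr_at(e) for e in range(args.epochs)]
```

Note that if `lr_drop` (default 200) exceeds the number of training epochs, the schedule stays flat for the whole run, matching the observation that omitting the flag changes nothing for a 60-epoch pretraining.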