Closed: function2-llx closed this issue 3 months ago
These unexpected and missing weights do not affect the model performance as they do not participate in the inference process.
Thanks for the reply. But I would like to finetune the model, so wouldn't it be better to fully utilize the pre-trained weights?
If you want to finetune the model on conventional downstream datasets (e.g., COCO), it would be better to adopt the COCO model and config. If you want to finetune it on datasets with long-tail distribution, please email me to obtain the LVIS training codes.
I see, thanks!
Dear authors,
I tried testing with the config https://github.com/Sense-X/Co-DETR/blob/221f1a38767f9456f2efe7d2e7f23f69b9c7cf84/projects/configs/co_dino_vit/co_dino_5scale_lsj_vit_large_lvis.py and the corresponding checkpoint. When loading the checkpoint, it produces the following messages:
There are three kinds of issues:

1. `fed_loss_weight` in `query_head.loss_cls` and `bbox_head.0.loss_cls`, which seems to be the loss weight for the federated loss.
2. `backbone.rope_glb.*` keys are reported here, while keys related to `backbone.rope_win` are not.
3. `roi_head`.

It seems that there may be a mismatch between the code in the repository and the weights. Could you please take a look at this?
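For reference, here is how I am interpreting the unexpected/missing key report. This is only a toy sketch of the comparison that `load_state_dict(strict=False)` effectively performs; the dicts and the specific key names below are illustrative stand-ins, not the real model or checkpoint:

```python
# Minimal sketch of how the unexpected/missing key report arises when a
# checkpoint is loaded non-strictly. The dicts are toy stand-ins: the
# checkpoint carries a fed_loss_weight entry the current code does not
# define, while the code defines a backbone.rope_win.* key the checkpoint
# lacks. Key names are illustrative, modeled on the ones in this issue.

def diff_keys(model_state, ckpt_state):
    """Return (unexpected, missing) key lists, like load_state_dict(strict=False)."""
    model_keys = set(model_state)
    ckpt_keys = set(ckpt_state)
    unexpected = sorted(ckpt_keys - model_keys)  # in checkpoint, not in model
    missing = sorted(model_keys - ckpt_keys)     # in model, not in checkpoint
    return unexpected, missing

model_state = {
    "backbone.rope_win.freqs": 0,
    "query_head.loss_cls.weight": 0,
}
ckpt_state = {
    "query_head.loss_cls.weight": 0,
    "query_head.loss_cls.fed_loss_weight": 0,
}

unexpected, missing = diff_keys(model_state, ckpt_state)
print("unexpected:", unexpected)  # ['query_head.loss_cls.fed_loss_weight']
print("missing:", missing)        # ['backbone.rope_win.freqs']
```

Since these keys fall outside the intersection, they are simply skipped at load time, which matches your earlier comment that they do not affect inference.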
Best regards