dddzg / up-detr

[TPAMI 2022 & CVPR2021 Oral] UP-DETR: Unsupervised Pre-training for Object Detection with Transformers

Error in notebook while loading up-detr-coco-fine-tuned-300ep.pth #13

Closed geetumolbabu91 closed 3 years ago

geetumolbabu91 commented 3 years ago

RuntimeError: Error(s) in loading state_dict for UPDETR:
    size mismatch for class_embed.weight: copying a param with shape torch.Size([92, 256]) from checkpoint, the shape in current model is torch.Size([3, 256]).
    size mismatch for class_embed.bias: copying a param with shape torch.Size([92]) from checkpoint, the shape in current model is torch.Size([3]).

dddzg commented 3 years ago

The checkpoint was trained with 92 classes, but your current model only has 3 classes. You can remove the class_embed keys from the state dict before loading it; see the sketch below.
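For example, a minimal sketch of that fix, assuming the checkpoint stores its weights under a "model" key (DETR-style) and that `model` is your UPDETR instance already built with your own number of classes:

```python
import torch

# Minimal sketch of the suggested fix (assumptions: checkpoint uses a "model"
# key, and `model` is an UPDETR instance built with your own num_classes).
checkpoint = torch.load("up-detr-coco-fine-tuned-300ep.pth", map_location="cpu")
state_dict = checkpoint["model"]

# Drop the classification head, whose shape depends on the number of classes.
for key in ("class_embed.weight", "class_embed.bias"):
    state_dict.pop(key, None)

# strict=False keeps the model's randomly initialized class_embed
# and loads everything else from the checkpoint.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```

The class_embed head is then trained from scratch during your fine-tuning, while the backbone and transformer weights come from the checkpoint.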

liuchengying758650786 commented 7 months ago

I have the opposite situation.

RuntimeError: Error(s) in loading state_dict for UPDETR:
    size mismatch for class_embed.weight: copying a param with shape torch.Size([3, 256]) from checkpoint, the shape in current model is torch.Size([92, 256]).
    size mismatch for class_embed.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([92]).

How can I fix this?