Closed geetumolbabu91 closed 3 years ago
The checkpoint was trained with 92 classes, but your current model only has 3. You can remove the class_embed keys from the state dict before loading it.
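A minimal sketch of that fix, using a toy state dict in place of the real UP-DETR checkpoint (the key names `class_embed.weight` / `class_embed.bias` are from the error messages below; the tensor shapes and module layout here are illustrative only). Dropping the class head keys and loading with `strict=False` lets the remaining weights load while the mismatched head keeps its fresh initialization:

```python
import torch
import torch.nn as nn

# Toy stand-in for the checkpoint's state dict: a 92-class head
# plus one other parameter that we do want to keep.
ckpt_state = {
    "class_embed.weight": torch.zeros(92, 256),
    "class_embed.bias": torch.zeros(92),
    "other.weight": torch.ones(4, 4),
}

# Drop every key belonging to the classification head, whose shape
# depends on the number of classes.
filtered = {k: v for k, v in ckpt_state.items()
            if not k.startswith("class_embed")}

# Toy model with a 3-class head standing in for the current model.
model = nn.Module()
model.class_embed = nn.Linear(256, 3)
model.other = nn.Module()
model.other.weight = nn.Parameter(torch.zeros(4, 4))

# strict=False tolerates the now-missing class_embed keys.
missing, unexpected = model.load_state_dict(filtered, strict=False)
```

With a real checkpoint you would load it first (e.g. `torch.load(path, map_location="cpu")`), filter its `"model"` entry the same way, and then fine-tune the randomly initialized head on your own classes.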
I have the opposite situation.
RuntimeError: Error(s) in loading state_dict for UPDETR:
    size mismatch for class_embed.weight: copying a param with shape torch.Size([3, 256]) from checkpoint, the shape in current model is torch.Size([92, 256]).
    size mismatch for class_embed.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([92]).
How to fix it?
RuntimeError: Error(s) in loading state_dict for UPDETR:
    size mismatch for class_embed.weight: copying a param with shape torch.Size([92, 256]) from checkpoint, the shape in current model is torch.Size([3, 256]).
    size mismatch for class_embed.bias: copying a param with shape torch.Size([92]) from checkpoint, the shape in current model is torch.Size([3]).