Hi! @wkcn
Thanks for your contribution. When I load the pretrained model (mini_deit_small_patch16_224.pth), something goes wrong:
size mismatch for pos_embed: copying a param with shape torch.Size([1, 196, 384]) from checkpoint, the shape in current model is torch.Size([1, 197, 384]).
I looked at the code and found that the current model uses a cls_token, but the loaded checkpoint does not include one.
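In case it helps, here is a minimal sketch of a possible workaround, assuming the checkpoint's `pos_embed` is simply missing the one extra slot for the cls_token (196 patch positions vs. 197 in the model). `adapt_pos_embed` is a hypothetical helper, not part of the repo, and padding with zeros is only one choice; the proper fix may be a matching model config or an official converted checkpoint:

```python
import torch

def adapt_pos_embed(state_dict, model_state_dict, key="pos_embed"):
    """Hypothetical workaround: if the checkpoint's pos_embed has exactly
    one fewer token than the model expects, assume the missing slot is
    the cls_token position and prepend a zero embedding for it."""
    ckpt = state_dict[key]                 # e.g. shape [1, 196, 384]
    target = model_state_dict[key]         # e.g. shape [1, 197, 384]
    if ckpt.shape[1] == target.shape[1] - 1:
        cls_pos = torch.zeros(1, 1, ckpt.shape[2], dtype=ckpt.dtype)
        state_dict[key] = torch.cat([cls_pos, ckpt], dim=1)
    return state_dict

# Possible usage (untested against this repo's loading code):
# checkpoint = torch.load("mini_deit_small_patch16_224.pth", map_location="cpu")
# checkpoint["model"] = adapt_pos_embed(checkpoint["model"], model.state_dict())
# model.load_state_dict(checkpoint["model"])
```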