raoyongming / GFNet

[NeurIPS 2021] [T-PAMI] Global Filter Networks for Image Classification
https://gfnet.ivg-research.xyz/
MIT License

GFNet models pretrained on ImageNet cannot be loaded #33

Closed. qm-intel closed this issue 8 months ago.

qm-intel commented 8 months ago

@raoyongming Thanks for providing the ImageNet weights. I was trying to load the weights you shared in order to run some benchmarking analysis, but when I load them for gfnet-ti or gfnet-xs I get the following error:

model.load_state_dict(torch.load(weightpath))

RuntimeError: Error(s) in loading state_dict for GFNet:
        Missing key(s) in state_dict: "pos_embed", "patch_embed.proj.weight", "patch_embed.proj.bias", "blocks.0.norm1.weight", "blocks.0.norm1.bias", "blocks.0.filter.complex_weight", "blocks.0.norm2.weight", "blocks.0.norm2.bias", "blocks.0.mlp.fc1.weight", "blocks.0.mlp.fc1.bias", "blocks.0.mlp.fc2.weight", "blocks.0.mlp.fc2.bias", "blocks.1.norm1.weight", "blocks.1.norm1.bias", "blocks.1.filter.complex_weight", "blocks.1.norm2.weight", "blocks.1.norm2.bias", "blocks.1.mlp.fc1.weight", "blocks.1.mlp.fc1.bias", "blocks.1.mlp.fc2.weight", "blocks.1.mlp.fc2.bias", "blocks.2.norm1.weight", "blocks.2.norm1.bias", "blocks.2.filter.complex_weight", "blocks.2.norm2.weight", "blocks.2.norm2.bias", "blocks.2.mlp.fc1.weight", "blocks.2.mlp.fc1.bias", "blocks.2.mlp.fc2.weight", "blocks.2.mlp.fc2.bias", "blocks.3.norm1.weight", "blocks.3.norm1.bias", "blocks.3.filter.complex_weight", "blocks.3.norm2.weight", "blocks.3.norm2.bias", "blocks.3.mlp.fc1.weight", "blocks.3.mlp.fc1.bias", "blocks.3.mlp.fc2.weight", "blocks.3.mlp.fc2.bias", "norm.weight", "norm.bias", "head.weight", "head.bias". 
        Unexpected key(s) in state_dict: "model".

Are these weights for the same architecture as in the code? How can I resolve this? Thanks.
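
(Note: the "Unexpected key(s) in state_dict: 'model'" line suggests the downloaded file is a wrapper dict rather than a bare state_dict. A quick way to check, assuming `weightpath` points to the downloaded checkpoint:

    import torch

    # Load onto CPU and list the top-level keys; if the checkpoint is a
    # wrapper dict, this should print something like dict_keys(['model']).
    checkpoint = torch.load(weightpath, map_location='cpu')
    print(type(checkpoint), list(checkpoint.keys()))
)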

raoyongming commented 8 months ago

Hi, please refer to the method here to load pre-trained models.
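
A minimal sketch of that loading step, assuming the released checkpoint stores the weights under a top-level 'model' key (as the "Unexpected key(s)" message above indicates) and that `model` and `weightpath` are the same objects as in the snippet above:

    import torch

    # The checkpoint is a wrapper dict; the actual state_dict sits under 'model'.
    checkpoint = torch.load(weightpath, map_location='cpu')
    model.load_state_dict(checkpoint['model'])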

qm-intel commented 8 months ago

@raoyongming Thanks a lot.