huawei-noah / Efficient-AI-Backbones

Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab.

Hello, are there pretrained models for CMT? #151

Open ecjtujemmy opened 2 years ago

iamhankai commented 2 years ago

https://github.com/huawei-noah/Efficient-AI-Backbones/tree/master/cmt_pytorch#cmt-on-imagenet-1k-classification

ecjtujemmy commented 2 years ago

Thank you for the answer, but I ran into another problem when loading the pretrained model. It reports the error "No pretrained weights exist for this model. Using random initialization." Do you know what is going on?

```python
def cmt_b(pretrained=False, **kwargs):
    """ CMT-Base """
    model_kwargs = dict(
        qkv_bias=True, embed_dims=[76, 152, 304, 608], stem_channel=38,
        num_heads=[1, 2, 4, 8], depths=[4, 4, 20, 4], mlp_ratios=[4, 4, 4, 4],
        qk_ratio=1, sr_ratios=[8, 4, 2, 1], dp=0.3, **kwargs)
    model = _create_cmt_model(pretrained=pretrained, **model_kwargs)
    if pretrained:
        checkpoint = torch.load(r'D:\preweight\cmt_base.pth')  # todo: pass path as argument
        model.load_state_dict(checkpoint, strict=False)
        print("load CMT pretrained")
    return model
```
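(Editorial note: that warning usually comes from timm's `build_model_with_cfg`/`load_pretrained` path when the model's `default_cfg` has no download URL, so passing `pretrained=True` into `_create_cmt_model` only triggers the warning and a random init; the manual `torch.load` afterwards is what actually fills in the weights. Below is a minimal sketch of loading a locally downloaded checkpoint without going through that path. The import `from cmt import cmt_b`, the checkpoint path, and the `'model'`/`'state_dict'` wrapper keys are assumptions and may need adjusting for your file.)

```python
import torch
from cmt import cmt_b  # assumption: cmt.py from cmt_pytorch is on the import path

# Build the architecture with random init; pretrained=False skips timm's
# URL-based download that emits the "No pretrained weights exist" warning.
model = cmt_b(pretrained=False)

# Load the locally downloaded checkpoint (path is illustrative).
ckpt = torch.load(r'D:\preweight\cmt_base.pth', map_location='cpu')

# Training checkpoints are often wrapped in a 'model' or 'state_dict' key;
# fall back to the raw object if it is already a plain state dict.
state_dict = ckpt.get('model', ckpt.get('state_dict', ckpt))

# strict=False tolerates head/key mismatches; inspect what was skipped.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)
print('unexpected keys:', unexpected)
```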

ecjtujemmy commented 2 years ago

Thank you very much!
