Closed houxushi closed 2 years ago
We used the pretrained model released by the authors of In-place ABN (as said in the paper), which can be found in their GitHub project: https://github.com/mapillary/inplace_abn.
Since the pretrained weights were trained on multiple GPUs, each key of the network's state dict carries a "module." prefix. Please be sure to remove it to be compatible with this code (simply rename each key using key = key[7:]). If you don't want to use pretrained weights, please pass --no-pretrained.
I don't understand this "module." prefix and the keys, and I don't know how to change them in this code. Could you explain in detail? How can I rename them? I don't know how to apply key = key[7:]. Sorry to disturb you.
Oh sure, sorry!
So, basically you can have a script similar to this:
checkpoint = torch.load("the checkpoint path")
state = {}
for k, v in checkpoint.items():
    state[k[7:]] = v
torch.save(state, "the destination path")
The destination path should be something like pretrained/{opts.backbone}_{opts.norm_act}.pth.tar and it'll be automatically loaded by the code.
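The renaming step can also be written as a small helper that strips the prefix only when a key actually carries it, so keys without "module." are left untouched. This is just a sketch; strip_module_prefix is a hypothetical name, and the file paths below are placeholders.

```python
def strip_module_prefix(state_dict, prefix="module."):
    """Remove the DataParallel prefix from every key that carries it."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Usage with torch (paths are placeholders):
#   import torch
#   checkpoint = torch.load("the checkpoint path", map_location="cpu")
#   torch.save(strip_module_prefix(checkpoint), "the destination path")
```

Using startswith avoids corrupting keys that were saved without the prefix (e.g. from a single-GPU run), which the unconditional key[7:] slice would.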
Hope it's clearer now :)
Thank you very much for taking the time to answer my question! I will study your reply carefully. Good luck! :)