Closed chakkritte closed 5 years ago
@billhhh @chakkritte Yes, I have run into the same problem, and I think your code is incomplete. Could you share your complete code? Thanks!
@chakkritte @huxianer The source code is complete; it is the one I used for training. Just check which model you are using: if it takes 299 input, please change input_size in train.py from 224 to 299.
I think I ran into this problem before; changing the pretrained model name may help.
@billhhh I just use my own training dataset. When I run the script, the loss does not change and the rest of the output is incorrect.
That is because of DataParallel.
Hi
@billhhh Your pre-trained checkpoint has a weight named module.features.0.0.weight, but your source code expects features.0.0.weight.
I already fixed this problem using the solution from https://discuss.pytorch.org/t/solved-keyerror-unexpected-key-module-encoder-embedding-weight-in-state-dict/1686/4.
@huxianer I think this solution will help you.
Put the code below before the call to load_state_dict:
from collections import OrderedDict

import torch

# the original file was saved with DataParallel, so every key is prefixed with `module.`
state_dict = torch.load('myfile.pth.tar')

# create a new OrderedDict that does not contain the `module.` prefix
new_state_dict = OrderedDict()
for k, v in state_dict.items():
    name = k[7:]  # remove `module.`
    new_state_dict[name] = v

# load the cleaned-up params
model.load_state_dict(new_state_dict)
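The prefix-stripping fix above can be demonstrated end-to-end with a tiny stand-in model (the nn.Sequential model here is hypothetical, just for illustration; the thread's actual model is not shown):

```python
from collections import OrderedDict

import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 2))   # plain model
wrapped = nn.DataParallel(net)          # wrapping prefixes every key with `module.`

saved = wrapped.state_dict()            # keys: module.0.weight, module.0.bias

fresh = nn.Sequential(nn.Linear(4, 2))  # a non-DataParallel copy
# fresh.load_state_dict(saved)  # would raise: unexpected key "module.0.weight"

# strip the 7-character `module.` prefix from every key
stripped = OrderedDict((k[7:], v) for k, v in saved.items())
fresh.load_state_dict(stripped)         # loads cleanly now
```

An alternative with the same effect is to wrap the fresh model in nn.DataParallel before calling load_state_dict, so its keys carry the same `module.` prefix as the checkpoint.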
Hello @billhhh
I used your code and pre-trained model, but loading the state_dict gives an error.
Please share your full training source code.
thank you