Closed: jjccyy closed this issue 6 years ago
You only need to make sure that the number of LSTM hidden units in the model you are training matches the pretrained model. Then, in crnn_main.py:

```python
if opt.crnn != '':
    print('loading pretrained model from %s' % opt.crnn)
    pre_trainmodel = torch.load(opt.crnn)
    model_dict = crnn.state_dict()
    weig1 = 'rnn.1.embedding.weight'
    bias1 = 'rnn.1.embedding.bias'
    # If the final embedding (classification) layer has the same shape as in the
    # pretrained checkpoint, load everything; otherwise copy every weight except
    # that layer, so a different number of character classes still works.
    if len(model_dict[weig1]) == len(pre_trainmodel[weig1]) and len(model_dict[bias1]) == len(pre_trainmodel[bias1]):
        crnn.load_state_dict(pre_trainmodel)
    else:
        for k, v in model_dict.items():
            if k != weig1 and k != bias1:
                model_dict[k] = pre_trainmodel[k]
        crnn.load_state_dict(model_dict)
    print(crnn)
```
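For reference, in CRNN training scripts of this kind the number of output classes is usually not edited by hand; it is derived from the alphabet (the character set in the keys file) when the network is built. Below is a minimal sketch of that wiring, assuming a crnn.pytorch-style `CRNN(imgH, nc, nclass, nh)` constructor; the import paths, the `alphabet` variable, and the concrete sizes are assumptions, so adjust them to your local crnn_main.py:

```python
from models.crnn import CRNN   # assumed module path; adjust to this repo's layout
from key import alphabet       # assumed: the key.py mentioned above exposes the character set as a string

nclass = len(alphabet) + 1     # +1 for the CTC blank label; this is where the class count actually changes
nc = 1                         # grayscale input images
crnn = CRNN(32, nc, nclass, 256)  # imgH=32, nh=256 LSTM hidden units (keep nh equal to the pretrained model's)
print(crnn)
```

With nclass derived this way, editing the characters in key.py automatically changes the size of the final embedding layer, which is exactly the layer the snippet above skips when it loads the pretrained weights.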
The corresponding change has already been made. If you have any questions, you can contact me at 676715563@qq.com. @jjccyy
When I run crnn_main.py, my character classes are different from the pretrained model's, so I need to modify them. I changed the Chinese characters in key.py and changed requires_grad in crnn_main.py, but I could not find where to change the number of classes. Could you please tell me where to modify it?