Closed: BruceLee66 closed this issue 5 years ago
Is it the random seed? Is the mapping between word and embedding index consistent?
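One way to keep the word-to-index mapping consistent across runs is to save the vocabulary built at training time and reload it at inference time, rather than rebuilding it (which can assign different indices if iteration order changes). This is a minimal sketch with a hypothetical `vocab` dictionary, not code from the PWIM repository:

```python
import json

# Hypothetical word -> index mapping built at training time
vocab = {"<pad>": 0, "<unk>": 1, "hello": 2, "world": 3}

# Persist the mapping alongside the model checkpoint
with open("vocab.json", "w") as f:
    json.dump(vocab, f)

# At inference time, reload the SAME mapping instead of rebuilding it,
# so each word maps to the same embedding row as during training
with open("vocab.json") as f:
    loaded = json.load(f)

print(loaded["hello"])  # 2, same index as at training time
```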
I have trained the model myself and saved the parameters. I load them like this:

```python
pretrained_dict = torch.load('new_model_static.pkl')
# Drop the embedding weights before loading the rest of the parameters
a = 'word_embedding.weight'
b = 'copied_word_embedding.weight'
pretrained_dict.pop(a)
pretrained_dict.pop(b)
# Merge the saved parameters into the current model's state dict
model_dict = model.state_dict()
model_dict.update(pretrained_dict)
model.load_state_dict(model_dict)
model.eval()
```
I found the cause: some words do not exist in the vocabulary.
good!
I fed the same sentence pair into the PWIM model, but the result is different each time. Why is this?
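A common cause of run-to-run differences is an unseeded random number generator, e.g. randomly initialized embeddings for out-of-vocabulary words, or dropout left active because `model.eval()` was not called. A minimal sketch of seeding every RNG PyTorch code typically draws from (the `set_seed` helper name is my own, not from the PWIM code):

```python
import random

import numpy as np
import torch


def set_seed(seed: int) -> None:
    # Seed Python, NumPy, and PyTorch (CPU and all GPUs)
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Make cuDNN deterministic; this can slow training down
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


# With the same seed, the same random draws are reproduced exactly
set_seed(0)
a = torch.randn(3)
set_seed(0)
b = torch.randn(3)
print(torch.equal(a, b))  # True
```

If results still differ after seeding, check that `model.eval()` is called before inference so dropout and batch-norm behave deterministically.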