feng-yufei / Neural-Natural-Logic

Implementation of the first neural natural logic paper on natural language inference

Word embeddings size mismatch #3

Open AnaRadu24 opened 2 years ago

AnaRadu24 commented 2 years ago

When trying to run aligner.py (after running the preprocessing code to get the SNLI data, vocab, and word embeddings), I get the following error. Could you please provide the precomputed word embeddings with the correct number of embeddings, or suggest how to fix the issue?

```
Training Samples: 549367
Loaded Training Samples: 9824
Loaded Training Samples: 9842
Loaded ESIM_Aligner(
  (_word_embedding): Embedding(34023, 300, padding_idx=0)
  (_rnn_dropout): RNNDropout(p=0.5, inplace=False)
  (_encoding): Seq2SeqEncoder(
    (_encoder): LSTM(300, 300, batch_first=True, bidirectional=True)
  )
  (_attention): SoftmaxAttention()
  (_projection): Sequential(
    (0): Linear(in_features=2400, out_features=300, bias=True)
    (1): ReLU()
  )
  (_composition): Seq2SeqEncoder(
    (_encoder): LSTM(300, 300, batch_first=True, bidirectional=True)
  )
  (_classification): Sequential(
    (0): Dropout(p=0.5, inplace=False)
    (1): Linear(in_features=2400, out_features=300, bias=True)
    (2): Tanh()
    (3): Dropout(p=0.5, inplace=False)
    (4): Linear(in_features=300, out_features=3, bias=True)
  )
)
Loading pretrained model : ./results/esim_saved_model_snli-20200323-184427/esim_model.pt
Traceback (most recent call last):
  File "aligner.py", line 88, in <module>
    esim_model.load_state_dict(torch.load(init_checkpoint))
  File "/home/amr97/pytorch-env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 1483, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for ESIM_Aligner:
	size mismatch for _word_embedding.weight: copying a param with shape torch.Size([36652, 300]) from checkpoint, the shape in current model is torch.Size([34023, 300]).
```
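For reference, the vocabulary size baked into the checkpoint can be inspected directly. A minimal diagnostic sketch, where the key name `_word_embedding.weight` and the checkpoint path are taken from the log above:

```python
import torch

# Checkpoint path from the log above; adjust to your setup.
checkpoint_path = "./results/esim_saved_model_snli-20200323-184427/esim_model.pt"

state_dict = torch.load(checkpoint_path, map_location="cpu")
# The first dimension is the vocabulary size the checkpoint was trained with;
# it should match the vocab produced by your preprocessing run.
print(state_dict["_word_embedding.weight"].shape)  # here: torch.Size([36652, 300])
```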

feng-yufei commented 2 years ago

Please train an ESIM model based on the files you preprocessed; the vocab may differ depending on the tokenizer version.
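Retraining on the regenerated vocab is the proper fix, since the embedding shape will then match. As a stopgap for debugging only, one could load just the shape-compatible parameters and leave the mismatched embedding randomly initialized. A hedged sketch, reusing the `esim_model` and `init_checkpoint` names from the traceback above; note that accuracy will suffer until the model is retrained:

```python
import torch

# Stopgap only: copy shape-compatible parameters from the checkpoint.
# The mismatched _word_embedding.weight stays randomly initialized, so
# retraining with the regenerated vocab remains the real fix.
checkpoint = torch.load(init_checkpoint, map_location="cpu")
model_state = esim_model.state_dict()
compatible = {k: v for k, v in checkpoint.items()
              if k in model_state and v.shape == model_state[k].shape}
model_state.update(compatible)
esim_model.load_state_dict(model_state)
```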