Open violet-hai opened 3 years ago
Hi,
Can you make sure to delete any cached preprocessing data and try again?
Best, Alex
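The suggestion above is to delete the cached preprocessed data so it gets rebuilt with the ELMo token indexer included. A minimal sketch of doing that, assuming the cache lives in a directory (the path in the example is an assumption, not from the repo — check where your run actually writes its preprocessing output):

```python
# Hypothetical sketch: remove a cached-preprocessing directory so the next
# run re-indexes the data, this time with the ELMo token indexer included.
import shutil
from pathlib import Path

def clear_preproc_cache(cache_dir: str) -> bool:
    """Delete a cached-preprocessing directory if it exists.

    Returns True if a cache was found and removed, False otherwise.
    """
    path = Path(cache_dir)
    if path.is_dir():
        shutil.rmtree(path)
        return True
    return False

# Example call (the path here is an assumed location, adjust to your setup):
# clear_preproc_cache("/home/GLUE-baselines-master/glue_data/preproc")
```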
On Sat, Nov 14, 2020 at 9:42 AM vioelt notifications@github.com wrote:
When I set --elmo to 1, I get an error. I don't know allennlp well, and even after a few days I still haven't been able to solve the following error. If anyone knows a solution, please leave me a message. Thank you very much!
Traceback (most recent call last):
  File "/home/GLUE-baselines-master/src/main.py", line 280, in <module>
    sys.exit(main(sys.argv[1:]))
  File "/home/GLUE-baselines-master/src/main.py", line 177, in main
    args.load_model)
  File "/home/GLUE-baselines-master/src/trainer.py", line 240, in train
    output_dict = self._forward(batch, task=task, for_training=True)
  File "/home/GLUE-baselines-master/src/trainer.py", line 464, in _forward
    return self._model.forward(task, tensor_batch)
  File "/home/GLUE-baselines-master/src/models.py", line 219, in forward
    sent_emb = self.sent_encoder(input1)
  File "/home/anaconda3/envs/glue/lib/python3.6/site-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/GLUE-baselines-master/src/models.py", line 413, in forward
    sent_embs = self._highway_layer(self._text_field_embedder(sent))
  File "/home/anaconda3/envs/glue/lib/python3.6/site-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/anaconda3/envs/glue/lib/python3.6/site-packages/allennlp/modules/text_field_embedders/basic_text_field_embedder.py", line 63, in forward
    raise ConfigurationError(message)
allennlp.common.checks.ConfigurationError: "Mismatched token keys: dict_keys(['words']) and dict_keys(['elmo', 'words'])"
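For context, the error message itself describes the mismatch: the indexed batch carries only a 'words' key, while the model's text field embedder was built expecting both 'elmo' and 'words'. That is consistent with data that was indexed (and cached) before --elmo was enabled. A minimal sketch of the kind of key check that produces this error — a simplified stand-in, not allennlp's actual implementation:

```python
# Simplified stand-in for the key check in a text field embedder's forward:
# the token-indexer keys in the batch must match the model's embedder keys.
def check_token_keys(text_field_input: dict, token_embedders: dict) -> None:
    """Raise if the batch's token keys differ from the embedder keys."""
    if set(text_field_input) != set(token_embedders):
        raise ValueError(
            f"Mismatched token keys: {sorted(text_field_input)} "
            f"and {sorted(token_embedders)}"
        )

# A batch cached before --elmo was enabled carries only 'words' ...
cached_batch = {"words": [[4, 8, 15]]}
# ... while a model built with --elmo 1 expects both 'elmo' and 'words':
embedders = {"elmo": object(), "words": object()}
# check_token_keys(cached_batch, embedders)  # raises ValueError
```

This is why rebuilding the cached preprocessed data after enabling --elmo resolves the mismatch: the freshly indexed batches then carry both keys.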