Closed PetrochukM closed 6 years ago
Running through the tutorial, I ran across this error with the CrfTagger:
```
2017-12-05 00:00:52,542 - INFO - allennlp.common.params - random_seed = 13370
2017-12-05 00:00:52,542 - INFO - allennlp.common.params - numpy_seed = 1337
2017-12-05 00:00:52,542 - INFO - allennlp.common.params - pytorch_seed = 133
2017-12-05 00:00:52,543 - INFO - allennlp.common.checks - Pytorch version: 0.2.0_3
2017-12-05 00:00:52,544 - INFO - allennlp.common.params - dataset_reader.type = conll2003
2017-12-05 00:00:52,544 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.type = single_id
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.namespace = tokens
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.lowercase_tokens = True
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.type = characters
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.namespace = token_characters
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.byte_encoding = None
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.lowercase_characters = False
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.start_tokens = None
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.end_tokens = None
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.tag_label = ner
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - dataset_reader.feature_labels = ()
2017-12-05 00:00:52,545 - INFO - allennlp.common.params - train_data_path = tutorials/getting_started/simple_qa_subject_recognition_train.txt
2017-12-05 00:00:52,545 - INFO - allennlp.commands.train - Reading training data from tutorials/getting_started/simple_qa_subject_recognition_train.txt
2017-12-05 00:00:52,546 - INFO - allennlp.data.dataset_readers.conll2003 - Reading instances from lines in file at: tutorials/getting_started/simple_qa_subject_recognition_train.txt
147355it [00:02, 55775.35it/s]
2017-12-05 00:00:55,266 - INFO - allennlp.common.params - validation_data_path = tutorials/getting_started/simple_qa_subject_recognition_dev.txt
2017-12-05 00:00:55,266 - INFO - allennlp.commands.train - Reading validation data from tutorials/getting_started/simple_qa_subject_recognition_dev.txt
2017-12-05 00:00:55,267 - INFO - allennlp.data.dataset_readers.conll2003 - Reading instances from lines in file at: tutorials/getting_started/simple_qa_subject_recognition_dev.txt
21081it [00:00, 84397.73it/s]
2017-12-05 00:00:55,529 - INFO - allennlp.common.params - test_data_path = None
2017-12-05 00:00:55,529 - INFO - allennlp.commands.train - Creating a vocabulary using train, validation data.
2017-12-05 00:00:55,624 - INFO - allennlp.common.params - vocabulary.directory_path = None
2017-12-05 00:00:55,624 - INFO - allennlp.common.params - vocabulary.min_count = 1
2017-12-05 00:00:55,624 - INFO - allennlp.common.params - vocabulary.max_vocab_size = None
2017-12-05 00:00:55,624 - INFO - allennlp.common.params - vocabulary.non_padded_namespaces = ('*tags', '*labels')
2017-12-05 00:00:55,624 - INFO - allennlp.common.params - vocabulary.only_include_pretrained_words = False
2017-12-05 00:00:55,624 - INFO - allennlp.data.vocabulary - Fitting token dictionary from dataset.
100%|##########| 84219/84219 [00:03<00:00, 22847.17it/s]
2017-12-05 00:00:59,381 - WARNING - root - vocabulary serialization directory tmp/subject_recognition_3/vocabulary is not empty
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.type = crf_tagger
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.type = basic
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.type = embedding
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.num_embeddings = None
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.vocab_namespace = tokens
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.embedding_dim = 50
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.pretrained_file = https://s3-us-west-2.amazonaws.com/allennlp/datasets/glove/glove.6B.50d.txt.gz
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.projection_dim = None
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.trainable = True
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.padding_index = None
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.max_norm = None
2017-12-05 00:00:59,497 - INFO - allennlp.common.params - model.text_field_embedder.tokens.norm_type = 2.0
2017-12-05 00:00:59,498 - INFO - allennlp.common.params - model.text_field_embedder.tokens.scale_grad_by_freq = False
2017-12-05 00:00:59,498 - INFO - allennlp.common.params - model.text_field_embedder.tokens.sparse = False
2017-12-05 00:00:59,501 - INFO - allennlp.modules.token_embedders.embedding - Reading embeddings from file
2017-12-05 00:01:02,648 - INFO - allennlp.modules.token_embedders.embedding - Initializing pre-trained embedding layer
2017-12-05 00:01:02,894 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.type = character_encoding
2017-12-05 00:01:02,895 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.num_embeddings = None
2017-12-05 00:01:02,896 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.vocab_namespace = token_characters
2017-12-05 00:01:02,896 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.embedding_dim = 25
2017-12-05 00:01:02,897 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.pretrained_file = None
2017-12-05 00:01:02,897 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.projection_dim = None
2017-12-05 00:01:02,898 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.trainable = True
2017-12-05 00:01:02,898 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.padding_index = None
2017-12-05 00:01:02,898 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.max_norm = None
2017-12-05 00:01:02,899 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.norm_type = 2.0
2017-12-05 00:01:02,899 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.scale_grad_by_freq = False
2017-12-05 00:01:02,900 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.sparse = False
2017-12-05 00:01:02,901 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.type = gru
2017-12-05 00:01:02,901 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.batch_first = True
2017-12-05 00:01:02,902 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2017-12-05 00:01:02,902 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS:
2017-12-05 00:01:02,903 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.input_size = 25
2017-12-05 00:01:02,903 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.hidden_size = 80
2017-12-05 00:01:02,904 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.num_layers = 2
2017-12-05 00:01:02,904 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.dropout = 0.25
2017-12-05 00:01:02,904 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.bidirectional = True
2017-12-05 00:01:02,905 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.batch_first = True
2017-12-05 00:01:02,907 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.dropout = 0.0
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - model.encoder.type = gru
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - model.encoder.batch_first = True
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS:
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - model.encoder.input_size = 210
2017-12-05 00:01:02,908 - INFO - allennlp.common.params - model.encoder.hidden_size = 300
2017-12-05 00:01:02,909 - INFO - allennlp.common.params - model.encoder.num_layers = 2
2017-12-05 00:01:02,909 - INFO - allennlp.common.params - model.encoder.dropout = 0.5
2017-12-05 00:01:02,909 - INFO - allennlp.common.params - model.encoder.bidirectional = True
2017-12-05 00:01:02,909 - INFO - allennlp.common.params - model.encoder.batch_first = True
2017-12-05 00:01:02,923 - INFO - allennlp.common.params - model.label_namespace = labels
2017-12-05 00:01:02,924 - INFO - allennlp.common.params - model.initializer = []
2017-12-05 00:01:02,924 - INFO - allennlp.common.params - model.regularizer = [['transitions$', ConfigTree([('type', 'l2'), ('alpha', 0.01)])]]
2017-12-05 00:01:02,924 - INFO - allennlp.common.params - model.regularizer.list.list.type = l2
2017-12-05 00:01:02,924 - INFO - allennlp.nn.initializers - Initializing parameters
2017-12-05 00:01:02,924 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2017-12-05 00:01:02,924 - INFO - allennlp.nn.initializers - crf.end_transitions
2017-12-05 00:01:02,924 - INFO - allennlp.nn.initializers - crf.start_transitions
2017-12-05 00:01:02,924 - INFO - allennlp.nn.initializers - crf.transitions
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_hh_l0
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_hh_l0_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_hh_l1
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_hh_l1_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_ih_l0
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_ih_l0_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_ih_l1
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.bias_ih_l1_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_hh_l0
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_hh_l0_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_hh_l1
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_hh_l1_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_ih_l0
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_ih_l0_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_ih_l1
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - encoder._module.weight_ih_l1_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - tag_projection_layer._module.bias
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - tag_projection_layer._module.weight
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._embedding._module.weight
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_hh_l0
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_hh_l0_reverse
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_hh_l1
2017-12-05 00:01:02,925 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_hh_l1_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_ih_l0
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_ih_l0_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_ih_l1
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.bias_ih_l1_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_hh_l0
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_hh_l0_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_hh_l1
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_hh_l1_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_ih_l0
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_ih_l0_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_ih_l1
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_token_characters._encoder._module._module.weight_ih_l1_reverse
2017-12-05 00:01:02,926 - INFO - allennlp.nn.initializers - text_field_embedder.token_embedder_tokens.weight
2017-12-05 00:01:02,926 - INFO - allennlp.common.params - iterator.type = basic
2017-12-05 00:01:02,926 - INFO - allennlp.common.params - iterator.batch_size = 32
2017-12-05 00:01:02,926 - INFO - allennlp.data.dataset - Indexing dataset
100%|##########| 73678/73678 [00:05<00:00, 12670.97it/s]
2017-12-05 00:01:08,741 - INFO - allennlp.data.dataset - Indexing dataset
100%|##########| 10541/10541 [00:01<00:00, 8605.51it/s]
2017-12-05 00:01:09,967 - INFO - allennlp.common.params - trainer.patience = 10
2017-12-05 00:01:09,967 - INFO - allennlp.common.params - trainer.validation_metric = -loss
2017-12-05 00:01:09,967 - INFO - allennlp.common.params - trainer.num_epochs = 50
2017-12-05 00:01:09,967 - INFO - allennlp.common.params - trainer.cuda_device = 0
2017-12-05 00:01:09,968 - INFO - allennlp.common.params - trainer.grad_norm = None
2017-12-05 00:01:09,968 - INFO - allennlp.common.params - trainer.grad_clipping = None
2017-12-05 00:01:09,968 - INFO - allennlp.common.params - trainer.learning_rate_scheduler = None
2017-12-05 00:01:11,782 - INFO - allennlp.common.params - trainer.optimizer = adam
2017-12-05 00:01:11,782 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2017-12-05 00:01:11,782 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS:
2017-12-05 00:01:11,782 - INFO - allennlp.common.params - trainer.no_tqdm = False
2017-12-05 00:01:11,788 - INFO - allennlp.common.params - evaluate_on_test = False
2017-12-05 00:01:11,788 - INFO - allennlp.training.trainer - Beginning training.
2017-12-05 00:01:11,789 - INFO - allennlp.training.trainer - Epoch 0/49
0%|          | 0/2303 [00:00<?, ?it/s]
2017-12-05 00:01:11,789 - INFO - allennlp.training.trainer - Training
Traceback (most recent call last):
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/michael/Desktop/lattice/allennlp/allennlp/run.py", line 13, in <module>
    main(prog="python -m allennlp.run")
  File "/home/michael/Desktop/lattice/allennlp/allennlp/commands/__init__.py", line 77, in main
    args.func(args)
  File "/home/michael/Desktop/lattice/allennlp/allennlp/commands/train.py", line 73, in train_model_from_args
    train_model_from_file(args.param_path, args.serialization_dir)
  File "/home/michael/Desktop/lattice/allennlp/allennlp/commands/train.py", line 89, in train_model_from_file
    return train_model(params, serialization_dir)
  File "/home/michael/Desktop/lattice/allennlp/allennlp/commands/train.py", line 178, in train_model
    trainer.train()
  File "/home/michael/Desktop/lattice/allennlp/allennlp/training/trainer.py", line 369, in train
    train_metrics = self._train_epoch(epoch)
  File "/home/michael/Desktop/lattice/allennlp/allennlp/training/trainer.py", line 222, in _train_epoch
    loss.backward()
  File "/usr/local/lib/python3.6/dist-packages/torch/autograd/variable.py", line 156, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
RuntimeError: function ConcatBackward returned a gradient different than None at position 3, but the corresponding forward input was not a Variable
```
This is caused by #572. I am working on a fix today; thanks for reporting.
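For context, this class of `ConcatBackward` error in PyTorch 0.2 typically arose when a plain `Tensor` (not wrapped in an `autograd.Variable`) was fed into `torch.cat` alongside Variables: during `backward()`, autograd produced a gradient for a forward input that could not receive one. A minimal sketch of the pattern, written against modern PyTorch (0.4+, where `Variable` and `Tensor` are merged and the mix is legal); the variable names here are illustrative and not taken from the actual code path in #572:

```python
import torch

# A tensor tracked by autograd (the "Variable" of pre-0.4 PyTorch) and a
# plain tensor outside the graph (the analogue of an unwrapped Tensor).
tracked = torch.randn(2, 3, requires_grad=True)
plain = torch.randn(2, 3)

# Concatenate the two and backpropagate. Under PyTorch 0.2, mixing a
# non-Variable input into the graph this way is the kind of situation that
# produced the ConcatBackward RuntimeError; from 0.4 on, the gradient
# simply skips the untracked input.
out = torch.cat([tracked, plain], dim=1).sum()
out.backward()

print(tracked.grad)  # all ones: d(sum)/d(tracked)
print(plain.grad)    # None: `plain` never joined the graph
```

On current versions the fix is simply to ensure every tensor entering the graph has a consistent autograd status before concatenation.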