dwadden / dygiepp

Span-based system for named entity, relation, and event extraction.
MIT License

Some problem in ACE05-Event Predict #95

Closed Ricardokevins closed 2 years ago

Ricardokevins commented 2 years ago

Hi, awesome and complete work! I want to use the trained ACE05 event-extraction model as an API to extract events. I carefully read the README.md and clicked the model link in the Pretrained Models section (which gave me a tar.gz file). I then took txt files (one document per line) and created the data with format_new_dataset.py (from 2 txt files). After that, I ran the predict script below:

allennlp predict Mydata/ace05-event.tar.gz \
    Mydata/test_data.jsonl \
    --predictor dygie \
    --include-package dygie \
    --use-dataset-reader \
    --output-file Mydata/result.json \
    --cuda-device 0 \
    --silent
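For reference, a minimal sketch of the jsonl input that the predict command above consumes — one JSON object per line, with a document key, the dataset name, and pre-tokenized sentences. The field values here are hypothetical illustrations, not taken from the actual test file:

```python
import json

# Hypothetical example of one line of a DyGIE-style jsonl file.
doc = {
    "doc_key": "my_doc_0",    # hypothetical document identifier
    "dataset": "ace-event",   # should match the dataset the model was trained on
    "sentences": [
        # each sentence is a list of pre-tokenized words
        ["Marseille", "prosecutor", "says", "no", "videos", "were", "used", "."]
    ],
}

line = json.dumps(doc)
print(line)
```

Writing one such object per line produces a file in the shape `format_new_dataset.py` is meant to emit.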

However, I hit this error:

line 325, in make_output_human_readable
    for predictions, sentence in zip(output_dict["ner"]["predictions"], doc):
KeyError: 'predictions'
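As a minimal sketch of the failure mode (values are illustrative, based on the printed `output_dict` below): the `"ner"` sub-dict only contains a `"loss"` entry, so indexing `"predictions"` raises the `KeyError` — decoding apparently never ran, leaving nothing to zip over:

```python
# What the model appears to have returned here: loss entries only,
# with no "predictions" key under "ner".
output_dict = {"ner": {"loss": 0}}

try:
    predictions = output_dict["ner"]["predictions"]
except KeyError:
    # Reproduces the crash in make_output_human_readable:
    # the decode step that fills in "predictions" was never reached.
    predictions = None
```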
Ricardokevins commented 2 years ago

I tried to debug this myself, so I printed the output_dict:

{'coref': {'loss': 0}, 'relation': {'loss': 0}, 'ner': {'loss': 0}, 'events': {'loss': 0}, 'loss': tensor(0.), 'metadata': 0: Marseille prosecutor says " so far no videos were used in the crash investigation " despite media reports .
1: Journalists at Bild and Paris Match are " very confident " the video clip is real , an editor says .
2: Andreas Lubitz had informed his Lufthansa training school of an episode of severe depression , airline says .
3: 
 Membership gives the ICC jurisdiction over alleged crimes committed in Palestinian territories since last June .
4: Israel and the United States opposed the move , which could open the door to war crimes investigations against Israelis .
5: 
 Amnesty 's annual death penalty report catalogs encouraging signs , but setbacks in numbers of those sentenced to death .
6: Organization claims that governments around the world are using the threat of terrorism to advance executions .
7: The number of executions worldwide has gone down by almost 22 % compared with 2013 , but death sentences up by 28 % .
8: 
 Amnesty International releases its annual review of the death penalty worldwide ; much of it makes for grim reading .
9: Salil Shetty : Countries that use executions to deal with problems are on the wrong side of history .
10: 
 Museum : Anne Frank died earlier than previously believed .
11: Researchers re - examined archives and testimonies of survivors .
12: Anne and older sister Margot Frank are believed to have died in February 1945 .
13: 
 Student is no longer on Duke University campus and will face disciplinary review .
14: School officials identified student during investigation and the person admitted to hanging the noose , Duke says .
15: The noose , made of rope , was discovered on campus about 2 a.m.
16: 
 The Rev. Robert Schuller , 88 , had been diagnosed with esophageal cancer in 2013 .
17: His TV show , " Hour of Power , " was enormously popular in the 1970s and 1980s .
18: 
 Theia , a bully breed mix , was apparently hit by a car , whacked with a hammer and buried in a field .
19: " She 's a true miracle dog and she deserves a good life , " says Sara Mellado , who is looking for a home for Theia .
20: 
 Mohammad Javad Zarif has spent more time with John Kerry than any other foreign minister .
21: He once participated in a takeover of the Iranian Consulate in San Francisco .
22: The Iranian foreign minister tweets in English .
23: 
 Bob Barker returned to host " The Price Is Right " on Wednesday .
24: Barker , 91 , had retired as host in 2007 .
25: 
 College - bound basketball star asks girl with Down syndrome to high school prom .
26: Pictures of the two during the " prom - posal " have gone viral .
27: 

28: Former GOP representative compares President Obama to Andreas Lubitz .
29: Bachmann said with possible Iran deal , Obama will fly " entire nation into the rocks " Reaction on social media ?
30: She was blasted by Facebook commenters .
31: 
}
{'coref': 0, 'events': 1, 'ner': 0.5, 'relation': 0.5}
Ricardokevins commented 2 years ago

Full log as follows:

2022-03-12 18:09:00,918 - INFO - allennlp.common.plugins - Plugin allennlp_models available
2022-03-12 18:09:01,029 - INFO - allennlp.models.archival - loading archive file Mydata/ace05-event.tar.gz
2022-03-12 18:09:01,030 - INFO - allennlp.models.archival - extracting archive file Mydata/ace05-event.tar.gz to temp dir /tmp/tmpl55upr30
2022-03-12 18:09:06,447 - INFO - allennlp.common.params - type = from_instances
2022-03-12 18:09:06,448 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpl55upr30/vocabulary.
2022-03-12 18:09:06,449 - INFO - allennlp.common.params - model.type = dygie
2022-03-12 18:09:06,450 - INFO - allennlp.common.params - model.embedder.type = basic
2022-03-12 18:09:06,453 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.type = pretrained_transformer_mismatched
2022-03-12 18:09:06,453 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.model_name = roberta-base
2022-03-12 18:09:06,453 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.max_length = 512
2022-03-12 18:09:06,453 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.train_parameters = True
2022-03-12 18:09:06,454 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.last_layer_only = True
2022-03-12 18:09:06,454 - INFO - allennlp.common.params - model.embedder.token_embedders.bert.gradient_checkpointing = None
roberta-base
2022-03-12 18:09:06,455 - INFO - transformers.configuration_utils - loading configuration file /root/SheShuaijie/Data/PLM/roberta-base/config.json
2022-03-12 18:09:06,456 - INFO - transformers.configuration_utils - Model config RobertaConfig {
  "architectures": [
    "RobertaForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "type_vocab_size": 1,
  "vocab_size": 50265
}

2022-03-12 18:09:06,456 - INFO - transformers.modeling_utils - loading weights file /root/SheShuaijie/Data/PLM/roberta-base/pytorch_model.bin
2022-03-12 18:09:12,244 - INFO - transformers.modeling_utils - All model checkpoint weights were used when initializing RobertaModel.

2022-03-12 18:09:12,245 - INFO - transformers.modeling_utils - All the weights of RobertaModel were initialized from the model checkpoint at /root/SheShuaijie/Data/PLM/roberta-base.
If your task is similar to the task the model of the checkpoint was trained on, you can already use RobertaModel for predictions without further training.
2022-03-12 18:09:12,394 - INFO - transformers.configuration_utils - loading configuration file /root/SheShuaijie/Data/PLM/roberta-base/config.json
2022-03-12 18:09:12,395 - INFO - transformers.configuration_utils - Model config RobertaConfig {
  "architectures": [
    "RobertaForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "type_vocab_size": 1,
  "vocab_size": 50265
}

2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - Model name '/root/SheShuaijie/Data/PLM/roberta-base' not found in model shortcut name list (roberta-base, roberta-large, roberta-large-mnli, distilroberta-base, roberta-base-openai-detector, roberta-large-openai-detector). Assuming '/root/SheShuaijie/Data/PLM/roberta-base' is a path, a model identifier, or url to a directory containing tokenizer files.
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/added_tokens.json. We won't load it.
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/special_tokens_map.json. We won't load it.
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/tokenizer_config.json. We won't load it.
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/vocab.json
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/merges.txt
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,396 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,397 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/tokenizer.json
2022-03-12 18:09:12,488 - INFO - transformers.configuration_utils - loading configuration file /root/SheShuaijie/Data/PLM/roberta-base/config.json
2022-03-12 18:09:12,489 - INFO - transformers.configuration_utils - Model config RobertaConfig {
  "architectures": [
    "RobertaForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "type_vocab_size": 1,
  "vocab_size": 50265
}

2022-03-12 18:09:12,490 - INFO - transformers.tokenization_utils_base - Model name '/root/SheShuaijie/Data/PLM/roberta-base' not found in model shortcut name list (roberta-base, roberta-large, roberta-large-mnli, distilroberta-base, roberta-base-openai-detector, roberta-large-openai-detector). Assuming '/root/SheShuaijie/Data/PLM/roberta-base' is a path, a model identifier, or url to a directory containing tokenizer files.
2022-03-12 18:09:12,490 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/added_tokens.json. We won't load it.
2022-03-12 18:09:12,490 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/special_tokens_map.json. We won't load it.
2022-03-12 18:09:12,490 - INFO - transformers.tokenization_utils_base - Didn't find file /root/SheShuaijie/Data/PLM/roberta-base/tokenizer_config.json. We won't load it.
2022-03-12 18:09:12,490 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/vocab.json
2022-03-12 18:09:12,491 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/merges.txt
2022-03-12 18:09:12,491 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,491 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,491 - INFO - transformers.tokenization_utils_base - loading file None
2022-03-12 18:09:12,491 - INFO - transformers.tokenization_utils_base - loading file /root/SheShuaijie/Data/PLM/roberta-base/tokenizer.json
2022-03-12 18:09:12,579 - INFO - allennlp.common.params - model.feature_size = 20
2022-03-12 18:09:12,579 - INFO - allennlp.common.params - model.max_span_width = 8
2022-03-12 18:09:12,579 - INFO - allennlp.common.params - model.target_task = events
2022-03-12 18:09:12,580 - INFO - allennlp.common.params - model.initializer.regexes.0.1.type = xavier_normal
2022-03-12 18:09:12,580 - INFO - allennlp.common.params - model.initializer.regexes.0.1.gain = 1.0
2022-03-12 18:09:12,580 - INFO - allennlp.common.params - model.initializer.prevent_regexes = None
2022-03-12 18:09:12,581 - INFO - allennlp.common.params - model.module_initializer.regexes.0.1.type = xavier_normal
2022-03-12 18:09:12,581 - INFO - allennlp.common.params - model.module_initializer.regexes.0.1.gain = 1.0
2022-03-12 18:09:12,581 - INFO - allennlp.common.params - model.module_initializer.regexes.1.1.type = xavier_normal
2022-03-12 18:09:12,581 - INFO - allennlp.common.params - model.module_initializer.regexes.1.1.gain = 1.0
2022-03-12 18:09:12,582 - INFO - allennlp.common.params - model.module_initializer.prevent_regexes = None
2022-03-12 18:09:12,582 - INFO - allennlp.common.params - model.regularizer = None
2022-03-12 18:09:12,582 - INFO - allennlp.common.params - model.display_metrics = None
2022-03-12 18:09:12,583 - INFO - allennlp.common.params - ner.regularizer = None
2022-03-12 18:09:12,586 - INFO - allennlp.common.params - coref.spans_per_word = 0.3
2022-03-12 18:09:12,586 - INFO - allennlp.common.params - coref.max_antecedents = 100
2022-03-12 18:09:12,586 - INFO - allennlp.common.params - coref.coref_prop = 0
2022-03-12 18:09:12,586 - INFO - allennlp.common.params - coref.coref_prop_dropout_f = 0.0
2022-03-12 18:09:12,586 - INFO - allennlp.common.params - coref.regularizer = None
2022-03-12 18:09:12,637 - INFO - allennlp.common.params - relation.spans_per_word = 0.5
2022-03-12 18:09:12,637 - INFO - allennlp.common.params - relation.positive_label_weight = 1.0
2022-03-12 18:09:12,637 - INFO - allennlp.common.params - relation.regularizer = None
2022-03-12 18:09:12,648 - INFO - allennlp.common.params - events.trigger_spans_per_word = 0.3
2022-03-12 18:09:12,648 - INFO - allennlp.common.params - events.argument_spans_per_word = 0.8
2022-03-12 18:09:12,648 - INFO - allennlp.common.params - events.regularizer = None
2022-03-12 18:09:12,659 - INFO - allennlp.nn.initializers - Initializing parameters
2022-03-12 18:09:12,659 - INFO - allennlp.nn.initializers - Initializing _ner_scorers.ace-event__ner_labels.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers - Initializing _ner_scorers.ace-event__ner_labels.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers - Initializing _ner_scorers.ace-event__ner_labels.1._module.weight using .*weight initializer
2022-03-12 18:09:12,663 - WARNING - allennlp.nn.initializers - Did not use initialization regex that was passed: .*weight_matrix
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers -    _ner_scorers.ace-event__ner_labels.0._module._linear_layers.0.bias
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers -    _ner_scorers.ace-event__ner_labels.0._module._linear_layers.1.bias
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers -    _ner_scorers.ace-event__ner_labels.1._module.bias
2022-03-12 18:09:12,663 - INFO - allennlp.nn.initializers - Initializing parameters
2022-03-12 18:09:12,664 - INFO - allennlp.nn.initializers - Initializing _distance_embedding.weight using .*weight initializer
2022-03-12 18:09:12,664 - INFO - allennlp.nn.initializers - Initializing _antecedent_feedforward._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,672 - INFO - allennlp.nn.initializers - Initializing _antecedent_feedforward._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,673 - INFO - allennlp.nn.initializers - Initializing _mention_pruner._scorer.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,676 - INFO - allennlp.nn.initializers - Initializing _mention_pruner._scorer.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,676 - INFO - allennlp.nn.initializers - Initializing _mention_pruner._scorer.1._module.weight using .*weight initializer
2022-03-12 18:09:12,676 - INFO - allennlp.nn.initializers - Initializing _antecedent_scorer._module.weight using .*weight initializer
2022-03-12 18:09:12,676 - INFO - allennlp.nn.initializers - Initializing _f_network._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,730 - WARNING - allennlp.nn.initializers - Did not use initialization regex that was passed: .*weight_matrix
2022-03-12 18:09:12,730 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2022-03-12 18:09:12,730 - INFO - allennlp.nn.initializers -    _antecedent_feedforward._module._linear_layers.0.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _antecedent_feedforward._module._linear_layers.1.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _antecedent_scorer._module.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _f_network._linear_layers.0.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _mention_pruner._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _mention_pruner._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers -    _mention_pruner._scorer.1._module.bias
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers - Initializing parameters
2022-03-12 18:09:12,731 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,734 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,734 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__relation_labels._scorer.1._module.weight using .*weight initializer
2022-03-12 18:09:12,735 - INFO - allennlp.nn.initializers - Initializing _relation_feedforwards.ace-event__relation_labels._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,743 - INFO - allennlp.nn.initializers - Initializing _relation_feedforwards.ace-event__relation_labels._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers - Initializing _relation_scorers.ace-event__relation_labels.weight using .*weight initializer
2022-03-12 18:09:12,744 - WARNING - allennlp.nn.initializers - Did not use initialization regex that was passed: .*weight_matrix
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__relation_labels._scorer.1._module.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _relation_feedforwards.ace-event__relation_labels._linear_layers.0.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _relation_feedforwards.ace-event__relation_labels._linear_layers.1.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers -    _relation_scorers.ace-event__relation_labels.bias
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers - Initializing parameters
2022-03-12 18:09:12,744 - INFO - allennlp.nn.initializers - Initializing _trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,746 - INFO - allennlp.nn.initializers - Initializing _trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,746 - INFO - allennlp.nn.initializers - Initializing _trigger_scorers.ace-event__trigger_labels.1._module.weight using .*weight initializer
2022-03-12 18:09:12,746 - INFO - allennlp.nn.initializers - Initializing _trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,748 - INFO - allennlp.nn.initializers - Initializing _trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,749 - INFO - allennlp.nn.initializers - Initializing _trigger_pruners.ace-event__trigger_labels._scorer.1._module.weight using .*weight initializer
2022-03-12 18:09:12,749 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,752 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,752 - INFO - allennlp.nn.initializers - Initializing _mention_pruners.ace-event__argument_labels._scorer.1._module.weight using .*weight initializer
2022-03-12 18:09:12,752 - INFO - allennlp.nn.initializers - Initializing _argument_feedforwards.ace-event__argument_labels._linear_layers.0.weight using .*weight initializer
2022-03-12 18:09:12,757 - INFO - allennlp.nn.initializers - Initializing _argument_feedforwards.ace-event__argument_labels._linear_layers.1.weight using .*weight initializer
2022-03-12 18:09:12,757 - INFO - allennlp.nn.initializers - Initializing _argument_scorers.ace-event__argument_labels.weight using .*weight initializer
2022-03-12 18:09:12,757 - INFO - allennlp.nn.initializers - Initializing _distance_embedding.weight using .*weight initializer
2022-03-12 18:09:12,757 - WARNING - allennlp.nn.initializers - Did not use initialization regex that was passed: .*weight_matrix
2022-03-12 18:09:12,757 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2022-03-12 18:09:12,757 - INFO - allennlp.nn.initializers -    _argument_feedforwards.ace-event__argument_labels._linear_layers.0.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _argument_feedforwards.ace-event__argument_labels._linear_layers.1.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _argument_scorers.ace-event__argument_labels.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _mention_pruners.ace-event__argument_labels._scorer.1._module.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_pruners.ace-event__trigger_labels._scorer.1._module.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.0.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.1.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers -    _trigger_scorers.ace-event__trigger_labels.1._module.bias
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers - Initializing parameters
2022-03-12 18:09:12,758 - INFO - allennlp.nn.initializers - Initializing _endpoint_span_extractor._span_width_embedding.weight using _span_width_embedding.weight initializer
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers -    _coref._antecedent_feedforward._module._linear_layers.0.bias
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers -    _coref._antecedent_feedforward._module._linear_layers.0.weight
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers -    _coref._antecedent_feedforward._module._linear_layers.1.bias
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers -    _coref._antecedent_feedforward._module._linear_layers.1.weight
2022-03-12 18:09:12,761 - INFO - allennlp.nn.initializers -    _coref._antecedent_scorer._module.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._antecedent_scorer._module.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._distance_embedding.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._f_network._linear_layers.0.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._f_network._linear_layers.0.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.0._module._linear_layers.0.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.0._module._linear_layers.1.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.1._module.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _coref._mention_pruner._scorer.1._module.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.embeddings.LayerNorm.bias
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.embeddings.LayerNorm.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.embeddings.position_embeddings.weight
2022-03-12 18:09:12,762 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.embeddings.token_type_embeddings.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.embeddings.word_embeddings.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.output.LayerNorm.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.output.LayerNorm.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.output.dense.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.output.dense.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.key.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.key.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.query.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.query.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.value.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.attention.self.value.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.intermediate.dense.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.intermediate.dense.weight
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.output.LayerNorm.bias
2022-03-12 18:09:12,763 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.output.LayerNorm.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.output.dense.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.0.output.dense.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.output.LayerNorm.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.output.LayerNorm.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.output.dense.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.output.dense.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.key.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.key.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.query.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.query.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.value.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.attention.self.value.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.intermediate.dense.bias
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.intermediate.dense.weight
2022-03-12 18:09:12,764 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.output.LayerNorm.bias
2022-03-12 18:09:12,765 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.output.LayerNorm.weight
2022-03-12 18:09:12,765 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.output.dense.bias
2022-03-12 18:09:12,765 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.encoder.layer.1.output.dense.weight
[... identical initializer entries repeat for transformer_model.encoder layers 2 through 11; elided for brevity ...]
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.pooler.dense.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _embedder.token_embedder_bert._matched_embedder.transformer_model.pooler.dense.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_feedforwards.ace-event__argument_labels._linear_layers.0.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_feedforwards.ace-event__argument_labels._linear_layers.0.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_feedforwards.ace-event__argument_labels._linear_layers.1.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_feedforwards.ace-event__argument_labels._linear_layers.1.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_scorers.ace-event__argument_labels.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._argument_scorers.ace-event__argument_labels.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._distance_embedding.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.0.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.0._module._linear_layers.1.weight
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.1._module.bias
2022-03-12 18:09:12,776 - INFO - allennlp.nn.initializers -    _events._mention_pruners.ace-event__argument_labels._scorer.1._module.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.0.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.0._module._linear_layers.1.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.1._module.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_pruners.ace-event__trigger_labels._scorer.1._module.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.0.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.0.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.1.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.0._module._linear_layers.1.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.1._module.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _events._trigger_scorers.ace-event__trigger_labels.1._module.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.0._module._linear_layers.0.bias
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.0._module._linear_layers.0.weight
2022-03-12 18:09:12,777 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.0._module._linear_layers.1.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.0._module._linear_layers.1.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.1._module.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _ner._ner_scorers.ace-event__ner_labels.1._module.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.0.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.0.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.1.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.0._module._linear_layers.1.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.1._module.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._mention_pruners.ace-event__relation_labels._scorer.1._module.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._relation_feedforwards.ace-event__relation_labels._linear_layers.0.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._relation_feedforwards.ace-event__relation_labels._linear_layers.0.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._relation_feedforwards.ace-event__relation_labels._linear_layers.1.bias
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._relation_feedforwards.ace-event__relation_labels._linear_layers.1.weight
2022-03-12 18:09:12,778 - INFO - allennlp.nn.initializers -    _relation._relation_scorers.ace-event__relation_labels.bias
2022-03-12 18:09:12,779 - INFO - allennlp.nn.initializers -    _relation._relation_scorers.ace-event__relation_labels.weight
2022-03-12 18:09:15,188 - INFO - root - Loading a model trained before embedding extension was implemented; pass an explicit vocab namespace if you want to extend the vocabulary.
2022-03-12 18:09:15,189 - INFO - root - Loading a model trained before embedding extension was implemented; pass an explicit vocab namespace if you want to extend the vocabulary.
2022-03-12 18:09:15,190 - INFO - root - Loading a model trained before embedding extension was implemented; pass an explicit vocab namespace if you want to extend the vocabulary.
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.type = dygie
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.lazy = False
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.cache_directory = None
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.max_instances = None
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False
2022-03-12 18:09:15,666 - INFO - allennlp.common.params - dataset_reader.max_span_width = 8
2022-03-12 18:09:15,667 - INFO - allennlp.common.params - dataset_reader.token_indexers.bert.type = pretrained_transformer_mismatched
2022-03-12 18:09:15,667 - INFO - allennlp.common.params - dataset_reader.token_indexers.bert.token_min_padding_length = 0
2022-03-12 18:09:15,667 - INFO - allennlp.common.params - dataset_reader.token_indexers.bert.model_name = roberta-base
2022-03-12 18:09:15,667 - INFO - allennlp.common.params - dataset_reader.token_indexers.bert.namespace = tags
2022-03-12 18:09:15,667 - INFO - allennlp.common.params - dataset_reader.token_indexers.bert.max_length = 512
reading instances: 0it [00:00, ?it/s]/root/SheShuaijie/workspace/dygiepp-master/dygie/data/dataset_readers/dygie.py:195: UserWarning: Document doc1 has a sentence with a single token or no tokens. This may break the modeling code.
  warnings.warn(msg)
/root/SheShuaijie/workspace/dygiepp-master/dygie/data/dataset_readers/dygie.py:195: UserWarning: Document doc2 has a sentence with a single token or no tokens. This may break the modeling code.
  warnings.warn(msg)
reading instances: 2it [00:00, 73.82it/s]
Ricardokevins commented 2 years ago

And here is my data for prediction:

{"doc_key": "doc1", "dataset": "ace05-event", "sentences": [["Marseille", "prosecutor", "says", "\"", "so", "far", "no", "videos", "were", "used", "in", "the", "crash", "investigation", "\"", "despite", "media", "reports", "."], ["Journalists", "at", "Bild", "and", "Paris", "Match", "are", "\"", "very", "confident", "\"", "the", "video", "clip", "is", "real", ",", "an", "editor", "says", "."], ["Andreas", "Lubitz", "had", "informed", "his", "Lufthansa", "training", "school", "of", "an", "episode", "of", "severe", "depression", ",", "airline", "says", "."], ["\n", "Membership", "gives", "the", "ICC", "jurisdiction", "over", "alleged", "crimes", "committed", "in", "Palestinian", "territories", "since", "last", "June", "."], ["Israel", "and", "the", "United", "States", "opposed", "the", "move", ",", "which", "could", "open", "the", "door", "to", "war", "crimes", "investigations", "against", "Israelis", "."], ["\n", "Amnesty", "'s", "annual", "death", "penalty", "report", "catalogs", "encouraging", "signs", ",", "but", "setbacks", "in", "numbers", "of", "those", "sentenced", "to", "death", "."], ["Organization", "claims", "that", "governments", "around", "the", "world", "are", "using", "the", "threat", "of", "terrorism", "to", "advance", "executions", "."], ["The", "number", "of", "executions", "worldwide", "has", "gone", "down", "by", "almost", "22", "%", "compared", "with", "2013", ",", "but", "death", "sentences", "up", "by", "28", "%", "."], ["\n", "Amnesty", "International", "releases", "its", "annual", "review", "of", "the", "death", "penalty", "worldwide", ";", "much", "of", "it", "makes", "for", "grim", "reading", "."], ["Salil", "Shetty", ":", "Countries", "that", "use", "executions", "to", "deal", "with", "problems", "are", "on", "the", "wrong", "side", "of", "history", "."], ["\n", "Museum", ":", "Anne", "Frank", "died", "earlier", "than", "previously", "believed", "."], ["Researchers", "re", "-", "examined", "archives", "and", "testimonies", "of", 
"survivors", "."], ["Anne", "and", "older", "sister", "Margot", "Frank", "are", "believed", "to", "have", "died", "in", "February", "1945", "."], ["\n", "Student", "is", "no", "longer", "on", "Duke", "University", "campus", "and", "will", "face", "disciplinary", "review", "."], ["School", "officials", "identified", "student", "during", "investigation", "and", "the", "person", "admitted", "to", "hanging", "the", "noose", ",", "Duke", "says", "."], ["The", "noose", ",", "made", "of", "rope", ",", "was", "discovered", "on", "campus", "about", "2", "a.m."], ["\n", "The", "Rev.", "Robert", "Schuller", ",", "88", ",", "had", "been", "diagnosed", "with", "esophageal", "cancer", "in", "2013", "."], ["His", "TV", "show", ",", "\"", "Hour", "of", "Power", ",", "\"", "was", "enormously", "popular", "in", "the", "1970s", "and", "1980s", "."], ["\n", "Theia", ",", "a", "bully", "breed", "mix", ",", "was", "apparently", "hit", "by", "a", "car", ",", "whacked", "with", "a", "hammer", "and", "buried", "in", "a", "field", "."], ["\"", "She", "'s", "a", "true", "miracle", "dog", "and", "she", "deserves", "a", "good", "life", ",", "\"", "says", "Sara", "Mellado", ",", "who", "is", "looking", "for", "a", "home", "for", "Theia", "."], ["\n", "Mohammad", "Javad", "Zarif", "has", "spent", "more", "time", "with", "John", "Kerry", "than", "any", "other", "foreign", "minister", "."], ["He", "once", "participated", "in", "a", "takeover", "of", "the", "Iranian", "Consulate", "in", "San", "Francisco", "."], ["The", "Iranian", "foreign", "minister", "tweets", "in", "English", "."], ["\n", "Bob", "Barker", "returned", "to", "host", "\"", "The", "Price", "Is", "Right", "\"", "on", "Wednesday", "."], ["Barker", ",", "91", ",", "had", "retired", "as", "host", "in", "2007", "."], ["\n", "College", "-", "bound", "basketball", "star", "asks", "girl", "with", "Down", "syndrome", "to", "high", "school", "prom", "."], ["Pictures", "of", "the", "two", "during", "the", "\"", "prom", "-", "posal", "\"", 
"have", "gone", "viral", "."], ["\n"], ["Former", "GOP", "representative", "compares", "President", "Obama", "to", "Andreas", "Lubitz", "."], ["Bachmann", "said", "with", "possible", "Iran", "deal", ",", "Obama", "will", "fly", "\"", "entire", "nation", "into", "the", "rocks", "\"", "Reaction", "on", "social", "media", "?"], ["She", "was", "blasted", "by", "Facebook", "commenters", "."], ["\n"]]}
{"doc_key": "doc2", "dataset": "ace05-event", "sentences": [["Americans", "paid", "more", "for", "some", "fruits", "and", "vegetables", "last", "year", "because", "of", "the", "drought", "."], ["Tourists", "will", "now", "have", "to", "ask", "for", "a", "glass", "of", "water", "at", "a", "California", "restaurant", "."], ["Perhaps", "the", "only", "good", "thing", "is", "another", "\"", "great", "\"", "wine", "grape", "harvest", "last", "year", "."], ["\n"], ["While", "Republican", "Gov.", "Asa", "Hutchinson", "was", "weighing", "an", "Arkansas", "religious", "freedom", "bill", ",", "Walmart", "voiced", "its", "opposition", "."], ["Walmart", "and", "other", "high", "-", "profile", "businesses", "are", "showing", "their", "support", "for", "gay", "and", "lesbian", "rights", "."], ["Their", "stance", "puts", "them", "in", "conflict", "with", "socially", "conservative", "Republicans", ",", "traditionally", "seen", "as", "allies", "."], ["\n", "17", "Americans", "were", "exposed", "to", "the", "Ebola", "virus", "while", "in", "Sierra", "Leone", "in", "March", "."], ["Another", "person", "was", "diagnosed", "with", "the", "disease", "and", "taken", "to", "hospital", "in", "Maryland", "."], ["National", "Institutes", "of", "Health", "says", "the", "patient", "is", "in", "fair", "condition", "after", "weeks", "of", "treatment", "."], ["\n"], ["Andrew", "Getty", "'s", "death", "appears", "to", "be", "from", "natural", "causes", ",", "police", "say", ",", "citing", "coroner", "'s", "early", "assessment", "."], ["In", "a", "petition", "for", "a", "restraining", "order", ",", "Getty", "had", "written", "he", "had", "a", "serious", "medical", "condition", "."], ["Police", "say", "this", "is", "not", "a", "criminal", "matter", "at", "this", "time", "."], ["\n", "LZ", ":", "Indiana", "law", "pushing", "back", "LGBT", "rights", ",", "and", "other", "states", "'", "anti", "-", "LGBT", "moves", ",", "bow", "to", "far", "right", "wing", "that", "GOP", "candidates", "need", "for", 
"2016", "."], ["Cruz", ",", "Huckabee", ",", "Jindal", ",", "Carson", ",", "Walker", "are", "reviving", "culture", "wars", ",", "he", "says", "."], [" ", "Equality", "for", "LGBT", "has", "not", "yet", "\"", "won", "\"", "in", "America", "."], ["\n"], ["Once", "a", "super", "typhoon", ",", "Maysak", "is", "now", "a", "tropical", "storm", "with", "70", "mph", "winds", "."], ["It", "could", "still", "cause", "flooding", ",", "landslides", "and", "other", "problems", "in", "the", "Philippines", "."], ["\n", "Father", ":", "\"", "I", "know", "he", "went", "through", "what", "he", "went", "through", "\"", "Louis", "Jordan", "was", "found", "on", "his", "sailboat", ",", "which", "was", "listing", "and", "in", "bad", "shape", ",", "rescuer", "says", "."], ["He", "appears", "to", "be", "in", "good", "shape", ",", "physically", "and", "mentally", "."], ["\n", "\"", "Furious", "7", "\"", "pays", "tribute", "to", "star", "Paul", "Walker", ",", "who", "died", "during", "filming", "."], ["Vin", "Diesel", ":", "\"", "This", "movie", "is", "more", "than", "a", "movie", "\"", "\"", "Furious", "7", "\"", "opens", "Friday", "."]]}
Ricardokevins commented 2 years ago

Any suggestion is highly appreciated ~ I am still trying to fix the problem.

Ricardokevins commented 2 years ago

One more thing to mention: when I ran the code, it hit an error saying "Roberta-base" is not an identifier in the Hugging Face model hub. So I downloaded the model weights manually and modified the path to my local path. I'm not sure whether this may cause the ERROR.

Ricardokevins commented 2 years ago

ohhhhhhhhhh!!!!!

I fixed this problem using a past issue!

I had created the test data with the attribute "dataset": "ace05-event", which caused the ERROR.

After reading issue https://github.com/dwadden/dygiepp/issues/80, I tried replacing ace05-event with ace-event, and that fixed the ERROR!

What's more, the data format in issue https://github.com/dwadden/dygiepp/issues/18 is wrong and will cause the same ERROR.
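For anyone hitting the same KeyError, a minimal sketch of patching an existing prediction input file in place (the file paths are just my own; the only assumption is one JSON document per line, as in the DyGIE++ input format):

```python
import json

def fix_dataset_field(in_path, out_path):
    """Rewrite the "dataset" field of every document in a DyGIE++ input file.

    The pretrained ACE event model expects "ace-event", not "ace05-event".
    """
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            doc = json.loads(line)
            doc["dataset"] = "ace-event"
            fout.write(json.dumps(doc) + "\n")

if __name__ == "__main__":
    fix_dataset_field("Mydata/test_data.jsonl", "Mydata/test_data_fixed.jsonl")
```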

Ricardokevins commented 2 years ago

This data format will cause the ERROR:

{'doc_key': '0', 'sentences': [['Coronavirus', 'Thwarts', 'Rescue', 'of', 'Endangered', 'Albatrosses', 'Menaced', 'by', 'Giant', 'Mice'], ['The', 'covid-19', 'pandemic', 'has', 'forced', 'a', 'delay', 'to', 'an', 'effort', 'to', 'protect', 'vulnerable', 'seabirds', 'from', 'large,', 'invasive', 'mice', 'on', 'an', 'island', 'in', 'the', 'South', 'Atlantic', 'Ocean.']]}

And so will this one:

{"doc_key": "doc1", "dataset": "ace05-event", "sentences": [["Marseille", "prosecutor", "says", "\"", "so", "far", "no", "videos", "were", "used", "in", "the", "crash", "investigation", "\"", "despite", "media", "reports", "."], ["Journalists", "at", "Bild", "and", "Paris", "Match", "are", "\"", "very", "confident", "\"", "the", "video", "clip", "is", "real", ",", "an", "editor", "says", "."], ["Andreas", "Lubitz", "had", "informed", "his", "Lufthansa", "training", "school", "of", "an", "episode", "of", "severe", "depression", ",", "airline", "says", "."], ["\n", "Membership", "gives", "the", "ICC", "jurisdiction", "over", "alleged", "crimes", "committed", "in", "Palestinian", "territories", "since", "last", "June", "."], ["Israel", "and", "the", "United", "States", "opposed", "the", "move", ",", "which", "could", "open", "the", "door", "to", "war", "crimes", "investigations", "against", "Israelis", "."], ["\n", "Amnesty", "'s", "annual", "death", "penalty", "report", "catalogs", "encouraging", "signs", ",", "but", "setbacks", "in", "numbers", "of", "those", "sentenced", "to", "death", "."], ["Organization", "claims", "that", "governments", "around", "the", "world", "are", "using", "the", "threat", "of", "terrorism", "to", "advance", "executions", "."], ["The", "number", "of", "executions", "worldwide", "has", "gone", "down", "by", "almost", "22", "%", "compared", "with", "2013", ",", "but", "death", "sentences", "up", "by", "28", "%", "."], ["\n", "Amnesty", "International", "releases", "its", "annual", "review", "of", "the", "death", "penalty", "worldwide", ";", "much", "of", "it", "makes", "for", "grim", "reading", "."], ["Salil", "Shetty", ":", "Countries", "that", "use", "executions", "to", "deal", "with", "problems", "are", "on", "the", "wrong", "side", "of", "history", "."], ["\n", "Museum", ":", "Anne", "Frank", "died", "earlier", "than", "previously", "believed", "."], ["Researchers", "re", "-", "examined", "archives", "and", "testimonies", "of", 
"survivors", "."], ["Anne", "and", "older", "sister", "Margot", "Frank", "are", "believed", "to", "have", "died", "in", "February", "1945", "."], ["\n", "Student", "is", "no", "longer", "on", "Duke", "University", "campus", "and", "will", "face", "disciplinary", "review", "."], ["School", "officials", "identified", "student", "during", "investigation", "and", "the", "person", "admitted", "to", "hanging", "the", "noose", ",", "Duke", "says", "."], ["The", "noose", ",", "made", "of", "rope", ",", "was", "discovered", "on", "campus", "about", "2", "a.m."], ["\n", "The", "Rev.", "Robert", "Schuller", ",", "88", ",", "had", "been", "diagnosed", "with", "esophageal", "cancer", "in", "2013", "."], ["His", "TV", "show", ",", "\"", "Hour", "of", "Power", ",", "\"", "was", "enormously", "popular", "in", "the", "1970s", "and", "1980s", "."], ["\n", "Theia", ",", "a", "bully", "breed", "mix", ",", "was", "apparently", "hit", "by", "a", "car", ",", "whacked", "with", "a", "hammer", "and", "buried", "in", "a", "field", "."], ["\"", "She", "'s", "a", "true", "miracle", "dog", "and", "she", "deserves", "a", "good", "life", ",", "\"", "says", "Sara", "Mellado", ",", "who", "is", "looking", "for", "a", "home", "for", "Theia", "."], ["\n", "Mohammad", "Javad", "Zarif", "has", "spent", "more", "time", "with", "John", "Kerry", "than", "any", "other", "foreign", "minister", "."], ["He", "once", "participated", "in", "a", "takeover", "of", "the", "Iranian", "Consulate", "in", "San", "Francisco", "."], ["The", "Iranian", "foreign", "minister", "tweets", "in", "English", "."], ["\n", "Bob", "Barker", "returned", "to", "host", "\"", "The", "Price", "Is", "Right", "\"", "on", "Wednesday", "."], ["Barker", ",", "91", ",", "had", "retired", "as", "host", "in", "2007", "."], ["\n", "College", "-", "bound", "basketball", "star", "asks", "girl", "with", "Down", "syndrome", "to", "high", "school", "prom", "."], ["Pictures", "of", "the", "two", "during", "the", "\"", "prom", "-", "posal", "\"", 
"have", "gone", "viral", "."], ["\n"], ["Former", "GOP", "representative", "compares", "President", "Obama", "to", "Andreas", "Lubitz", "."], ["Bachmann", "said", "with", "possible", "Iran", "deal", ",", "Obama", "will", "fly", "\"", "entire", "nation", "into", "the", "rocks", "\"", "Reaction", "on", "social", "media", "?"], ["She", "was", "blasted", "by", "Facebook", "commenters", "."], ["\n"]]}

Correct format:

{"dataset": "ace-event","doc_key": "0", "sentences": [["Coronavirus", "Thwarts", "Rescue", "of", "Endangered", "Albatrosses", "Menaced", "by", "Giant", "Mice"], ["The", "covid-19", "pandemic", "has", "forced", "a", "delay", "to", "an", "effort", "to", "protect", "vulnerable", "seabirds", "from", "large", "invasive", "mice", "on", "an", "island", "in", "the", "South", "Atlantic", "Ocean."]]}
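For reference, a document in the correct format can be built from pre-tokenized sentences like this (the tokenization itself is up to you; here I just split on whitespace as an illustration, and the "dataset" value is the one the pretrained ACE event model expects):

```python
import json

def make_doc(doc_key, sentences):
    # Each sentence is a list of token strings.
    # "dataset" must be "ace-event" for the pretrained ACE event model.
    return {"dataset": "ace-event", "doc_key": doc_key, "sentences": sentences}

sents = [s.split() for s in [
    "Coronavirus Thwarts Rescue of Endangered Albatrosses Menaced by Giant Mice",
    "The pandemic has forced a delay to an effort to protect vulnerable seabirds .",
]]
print(json.dumps(make_doc("0", sents)))
```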
Ricardokevins commented 2 years ago

It's quite confusing that the "dataset" field can cause the output ERROR.

dwadden commented 2 years ago

Hi, are you now able to get predictions out of the model? Are there any points you'd like clarified?

Ricardokevins commented 2 years ago

Hi, are you now able to get predictions out of the model? Are there any points you'd like clarified?

Thank you for your reply!

Hi, I notice that in the ACE dataset, an event can be extracted with a TIME argument. However, in the predictions here I didn't see any TIME arguments (PERSON and ORG are predicted). Is this normal?

dwadden commented 2 years ago

I believe the pretrained model wasn't trained with TIME as an option. You can re-process the data with the appropriate flag set, and then train your own model if you'd like TIME arguments included.

Ricardokevins commented 2 years ago

I believe the pretrained model wasn't trained with TIME as an option. You can re-process the data with the appropriate flag set, and then train your own model if you'd like TIME arguments included.

Thank you so much~

dwadden commented 2 years ago

No problem, I hope it works!