Riccorl / transformer-srl

Reimplementation of a BERT-based model (Shi et al., 2019), currently the state of the art for English SRL. This model also implements predicate disambiguation.

Error while loading model #18

Open LeonHammerla opened 2 years ago

LeonHammerla commented 2 years ago

When I try the example, or my own model, I get the following error while loading the model:

predictor = predictors.SrlTransformersPredictor.from_path("/path/to/model/srl_bert_base_conll2012.tar.gz", "transformer_srl")

==> this is for my dependency model; for the example it changes to transformer_srl_span

allennlp.common.checks.ConfigurationError: transformer_srl_dependency not in acceptable choices for dataset_reader.type: ['babi', 'conll2003', 'interleaving', 'multitask', 'sequence_tagging', 'sharded', 'text_classification_json', 'multitask_shim', 'ptb_trees', 'semantic_dependencies', 'srl', 'universal_dependencies', 'sst_tokens', 'coref', 'preco', 'winobias', 'masked_language_modeling', 'next_token_lm', 'simple_language_modeling', 'copynet_seq2seq', 'seq2seq', 'cnn_dm', 'swag', 'commonsenseqa', 'piqa', 'fake', 'quora_paraphrase', 'snli', 'drop', 'qangaroo', 'quac', 'squad', 'squad1', 'squad2', 'transformer_squad', 'triviaqa', 'ccgbank', 'conll2000', 'ontonotes_ner', 'gqa', 'vqav2', 'visual-entailment']. You should either use the --include-package flag to make sure the correct module is loaded, or use a fully qualified class name in your config file like {"model": "my_module.models.MyModel"} to have it imported automatically.

Lisa-aa commented 11 months ago

I have the same issue. Did you manage to fix this? @LeonHammerla

Lisa-aa commented 11 months ago

I found out that this issue is caused by not having the entire import: from transformer_srl import dataset_readers, models, predictors. The linter says dataset_readers and models are unused, but they are necessary. @LeonHammerla
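The reason the "unused" imports matter is that AllenNLP resolves the "type" strings in a config (like transformer_srl_dependency) through a registry that is populated as a side effect of importing the modules that define those classes. Below is a minimal, self-contained sketch of that registration pattern (not AllenNLP's actual code; the registry and names here are hypothetical, for illustration only):

```python
# Minimal sketch of a registry populated at import time (hypothetical,
# not AllenNLP's real implementation). Importing the defining module runs
# the decorators, which register the class; skip the import and the
# "type" name is never known, producing a "not in acceptable choices" error.

_DATASET_READERS = {}  # registry mapping "type" name -> class


def register(name):
    """Decorator that records a class in the registry at import time."""
    def wrap(cls):
        _DATASET_READERS[name] = cls
        return cls
    return wrap


def from_name(name):
    """Resolve a config "type" string to its class."""
    if name not in _DATASET_READERS:
        raise KeyError(
            f"{name} not in acceptable choices: {sorted(_DATASET_READERS)}"
        )
    return _DATASET_READERS[name]


# This decorator only runs when the defining module is imported -- which
# is why `from transformer_srl import dataset_readers, models` is needed
# even though those names are never referenced afterwards.
@register("transformer_srl_dependency")
class SrlDependencyReader:
    pass
```

So keeping from transformer_srl import dataset_readers, models, predictors in full (or passing --include-package transformer_srl, as the error message suggests) makes the custom readers and models visible before the archive is loaded.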