Riccorl / transformer-srl

Reimplementation of a BERT-based model (Shi et al., 2019), currently the state of the art for English SRL. This model also performs predicate disambiguation.

Error after upgrading to v2.4.6 #6

Closed by logicReasoner 3 years ago

logicReasoner commented 3 years ago

@Riccorl , hi!

I have upgraded to https://github.com/Riccorl/transformer-srl/archive/2.4.6.tar.gz and downloaded the pretrained model srl_bert_base_conll2012.tar.gz from https://www.dropbox.com/s/4tes6ypf2do0feb/srl_bert_base_conll2012.tar.gz, but now I am getting the following error during initialization:

```
2021-01-11 09:17:25 INFO     allennlp.nn.initializers  -    transformer.encoder.layer.9.output.dense.weight
I0111 09:17:25.121117 140505917486912 initializers.py:508]    transformer.pooler.dense.bias
2021-01-11 09:17:25 INFO     allennlp.nn.initializers  -    transformer.pooler.dense.bias
I0111 09:17:25.121154 140505917486912 initializers.py:508]    transformer.pooler.dense.weight
2021-01-11 09:17:25 INFO     allennlp.nn.initializers  -    transformer.pooler.dense.weight
I0111 09:17:25.315172 140505917486912 archival.py:211] removing temporary unarchived model dir at /tmp/tmp7vamph68
2021-01-11 09:17:25 INFO     allennlp.models.archival  - removing temporary unarchived model dir at /tmp/tmp7vamph68
Traceback (most recent call last):
  File "/home/dev/.local/bin/rasa", line 10, in <module>
    sys.exit(main())
  File "/home/dev/.local/lib/python3.6/site-packages/rasa/__main__.py", line 76, in main
    cmdline_arguments.func(cmdline_arguments)
  File "/home/dev/.local/lib/python3.6/site-packages/rasa/cli/run.py", line 88, in run
    rasa.run(**vars(args))
  File "/home/dev/.local/lib/python3.6/site-packages/rasa/run.py", line 33, in run
    import rasa.core.run
  File "/home/dev/.local/lib/python3.6/site-packages/rasa/core/run.py", line 22, in <module>
    from rasa.server import add_root_route
  File "/home/dev/.local/lib/python3.6/site-packages/rasa/server.py", line 76, in <module>
    srlPredictor = predictors.SrlTransformersPredictor.from_path("~/srl_bert_base_conll2012.tar.gz", "transformer_srl")
  File "/home/dev/nluFiles/transformer_srl/predictors.py", line 156, in from_path
    load_archive(archive_path, cuda_device=cuda_device),
  File "/home/dev/.local/lib/python3.6/site-packages/allennlp/models/archival.py", line 208, in load_archive
    model = _load_model(config.duplicate(), weights_path, serialization_dir, cuda_device)
  File "/home/dev/.local/lib/python3.6/site-packages/allennlp/models/archival.py", line 246, in _load_model
    cuda_device=cuda_device,
  File "/home/dev/.local/lib/python3.6/site-packages/allennlp/models/model.py", line 406, in load
    return model_class._load(config, serialization_dir, weights_file, cuda_device)
  File "/home/dev/.local/lib/python3.6/site-packages/allennlp/models/model.py", line 326, in _load
    missing_keys, unexpected_keys = model.load_state_dict(model_state, strict=False)
  File "/home/dev/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 1052, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for TransformerSrlSpan:
        size mismatch for frame_projection_layer.weight: copying a param with shape torch.Size([5497, 768]) from checkpoint, the shape in current model is torch.Size([5929, 768]).
        size mismatch for frame_projection_layer.bias: copying a param with shape torch.Size([5497]) from checkpoint, the shape in current model is torch.Size([5929]).
```

What am I missing?

logicReasoner commented 3 years ago

Additionally, here is a list of the installed Python 3.6 packages: https://pastebin.com/E70UWGuQ

Riccorl commented 3 years ago

> Additionally, here is a list of the installed Python 3.6 packages: https://pastebin.com/E70UWGuQ

The list shows that you are using version 2.4.11. I think that is the problem: the model changes a lot from one version to another, since I'm experimenting with things for my thesis.
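For context, the `size mismatch` error above is exactly what a label-vocabulary change between releases produces: the checkpoint was saved with a frame projection layer for 5497 labels, while the installed code builds one for 5929. A minimal sketch (plain Python, no torch; the function and shape dictionaries are illustrative) of the shape check that `load_state_dict` performs:

```python
# Sketch of torch's load_state_dict shape check: a checkpoint tensor can
# only be copied into a model parameter of the same shape; mismatches are
# collected and raised as a RuntimeError.

def simulate_load_state_dict(model_shapes, checkpoint_shapes):
    """Return the size-mismatch error messages torch would report."""
    errors = []
    for name, ckpt_shape in checkpoint_shapes.items():
        model_shape = model_shapes.get(name)
        if model_shape is not None and model_shape != ckpt_shape:
            errors.append(
                f"size mismatch for {name}: copying a param with shape "
                f"{ckpt_shape} from checkpoint, the shape in current model "
                f"is {model_shape}."
            )
    return errors

# Shapes taken from the traceback: v2.4.6 checkpoint vs. v2.4.11 code.
checkpoint = {"frame_projection_layer.weight": (5497, 768)}
model = {"frame_projection_layer.weight": (5929, 768)}
messages = simulate_load_state_dict(model, checkpoint)
```

Because the mismatch is in the layer's shape itself (not just missing keys), `strict=False` does not help; the installed package version has to match the one the archive was trained with.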

logicReasoner commented 3 years ago

I've downgraded transformer-srl to 2.4.6 (using pip3), and I am still getting the same error. I'll try installing everything from scratch, just in case.
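One way to rule out a stale or shadowed install (e.g. a copy left in a different site-packages directory) is to check the installed distribution programmatically before loading the archive. A hypothetical guard, sketched below; note that `importlib.metadata` requires Python 3.8+, so on the Python 3.6 setup from this thread `pkg_resources.get_distribution` would serve the same role:

```python
# Hypothetical check: verify the installed transformer-srl version matches
# the release the pretrained archive was built against, before loading it.
# importlib.metadata needs Python 3.8+; on 3.6 use pkg_resources instead.
from importlib import metadata

EXPECTED = "2.4.6"  # version srl_bert_base_conll2012.tar.gz is reported to work with

def installed_version(package):
    """Return the installed version string, or None if the package is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

def check_model_compat(package="transformer-srl", expected=EXPECTED):
    """Fail fast with an actionable message instead of a shape-mismatch error."""
    found = installed_version(package)
    if found != expected:
        raise RuntimeError(
            f"{package} {found} is installed, but the pretrained archive "
            f"expects {expected}; run: pip3 install {package}=={expected}"
        )
```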

logicReasoner commented 3 years ago

BTW, do I also need a different (newer?) version of bert-base-cased with v2.4.6? It is the only piece of software that I haven't upgraded at this point. If yes, where do I get it?

Riccorl commented 3 years ago

> BTW, do I also need a different (newer?) version of bert-base-cased with v2.4.6? It is the only piece of software that I haven't upgraded at this point. If yes, where do I get it?

You shouldn't. When you install v2.4.6, it also installs the dependencies it needs, so it should work out of the box. Are you still having issues?

logicReasoner commented 3 years ago

Nah, I was just wondering. Thanks
