Unipisa / diaparser

Direct Attentive Dependency Parser

Error when using the model for Dutch #8

Closed · md975 closed this issue 3 years ago

md975 commented 3 years ago

Hello,

Thanks for making the parser available! I'm using it on a few languages and it works fine, but when I try to load the model for Dutch (within the same pipeline and environment that works for the other models):

from diaparser.parsers import Parser
parser = Parser.load('nl_alpino_lassysmall.wietsedv')

I get the following error:

Traceback (most recent call last):
  File "lib/python3.7/site-packages/diaparser/parsers/parser.py", line 263, in load
    model.load_state_dict(state['state_dict'], False)
  File "lib/python3.7/site-packages/torch/nn/modules/module.py", line 1052, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for BiaffineDependencyModel:
    size mismatch for feat_embed.bert.embeddings.word_embeddings.weight: copying a param with shape torch.Size([30000, 768]) from checkpoint, the shape in current model is torch.Size([30073, 768]).
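For what it's worth, the mismatch seems to be between the checkpoint (30000 embedding rows) and the encoder that my installed transformers builds (30073 rows). A quick way to see what the installed version reports, assuming the parser wraps wietsedv/bert-base-dutch-cased as the model name suffix suggests (that model id is my guess, not something stated in the diaparser docs):

# Inspect the vocabulary size the installed transformers version uses for the
# (assumed) underlying Dutch BERT encoder.
from transformers import AutoConfig, AutoTokenizer

name = 'wietsedv/bert-base-dutch-cased'  # assumed underlying encoder
config = AutoConfig.from_pretrained(name)
tokenizer = AutoTokenizer.from_pretrained(name)

print(config.vocab_size)   # size used to build the embedding matrix
print(len(tokenizer))      # tokenizer vocabulary, including any added tokens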

Can you please help me with this?

Thanks

satriowputra commented 3 years ago

@md975 I got the same error with Transformers 4.8.1. With Transformers 3.1 the problem goes away and the pretrained Diaparser models load correctly. Run pip install transformers==3.1 to install the compatible version; a sketch of the full workaround is below.
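A minimal sketch of the workaround, assuming a fresh environment where transformers has been pinned first; the predict() call follows the pattern shown in the diaparser README, and the Dutch sentence is just an example:

# In the shell, pin the transformers version first:
#   pip install transformers==3.1

from diaparser.parsers import Parser

# Load the pretrained Dutch model; with transformers 3.1 the state_dict
# shapes match and loading succeeds.
parser = Parser.load('nl_alpino_lassysmall.wietsedv')

# Parse a raw sentence; text='nl' selects the Dutch tokenizer, mirroring the
# README's predict() usage for other languages.
dataset = parser.predict('Ik wil graag een kopje koffie.', text='nl')
print(dataset.sentences[0])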

md975 commented 3 years ago

@satriowputra this worked, thanks!