Cartus / AMR-Parser

Authors' DyNet implementation of the EMNLP 2018 paper "Better Transition-Based AMR Parsing with a Refined Search Space"
https://aclweb.org/anthology/papers/D/D18/D18-1198/
MIT License

Load pretrained model #4

Closed by rafaelanchieta 5 years ago

rafaelanchieta commented 5 years ago

Hi. When I run this command:

    python3 train.py --load_model 1 --model result/pretrain15.model --dev_file data/test.transitions --gold_AMR_dev data/amr/tmp_amr/test/amr.txt

I get this error:

    Traceback (most recent call last):
      File "train.py", line 227, in <module>
        parser.load_model(args.model)
      File "/home/rafael/Documentos/AMR-Parser/model/stack_lstm.py", line 109, in load_model
        self.pc.populate(filename)
      File "_dynet.pyx", line 1461, in _dynet.ParameterCollection.populate
      File "_dynet.pyx", line 1516, in _dynet.ParameterCollection.populate_from_textfile
    RuntimeError: Dimensions of parameter /_24 looked up from file ({11735,200}) do not match parameters to be populated ({11733,200})

Cartus commented 5 years ago

It seems that the data preprocessing affects the vocabulary of the training data, so the previous pretrained model can no longer be used after I updated the preprocessing code. I will fix the bugs for the AMR 2017 corpus first and then release a new pretrained model.
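For anyone hitting the same error, here is a minimal DyNet sketch of the mismatch (not code from this repo; the vocabulary sizes 11735/11733 and the 200-dimensional embeddings are simply the numbers reported in the traceback above). A lookup parameter saved with one vocabulary size cannot be populated into a collection that declared a different size:

    import dynet as dy

    EMB_DIM = 200  # embedding size from the error message

    # Pretrained model: built when preprocessing produced 11735 vocabulary entries.
    old_pc = dy.ParameterCollection()
    old_pc.add_lookup_parameters((11735, EMB_DIM))
    old_pc.save("pretrain15.model")

    # Current run: the updated preprocessing yields only 11733 entries, so the
    # lookup parameter is declared with a different shape.
    new_pc = dy.ParameterCollection()
    new_pc.add_lookup_parameters((11733, EMB_DIM))
    new_pc.populate("pretrain15.model")  # RuntimeError: {11735,200} vs {11733,200}

In other words, the shapes only line up again when the pretrained model and the vocabulary come from the same preprocessing version, which is what the upcoming re-release of the model addresses.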

Sorry for the inconvenience caused. I will update the repo as soon as possible.