Cartus / AMR-Parser

Better Transition-Based AMR Parsing with a Refined Search Space (authors' DyNet implementation for the EMNLP18 paper)
https://aclweb.org/anthology/papers/D/D18/D18-1198/
MIT License

Parsing with a trained parser: load the pre-trained model #5

Open chenfeng15a opened 4 years ago

chenfeng15a commented 4 years ago

Hi Cartus, when I run:

python3 train.py --load_model 1 --model result/pretrain14.model --dev_file data/test.transitions --gold_AMR_dev ../jamr/scripts/test/amr.txt

the following error occurs:

[dynet] random seed: 615808844
[dynet] allocating memory: 512MB
[dynet] memory allocation done.
0... done reading training actions file

sents: 2

tokens: 46

types: 36

POStags: 13

actions: 42

preds: 35

0... done reading training actions file

sents: 2

tokens: 29

types: 36

POStags: 19

actions: 43

action types: 8

preds: 36

Start loading preds

lemmas with associated PR acts = 25

creating dictionary of pretrain embeddings
finish creating
UNK index in tok_dict_pretrain: 0
UNK index in tok_dict_all: 0
0... done loading raw sent

sents: 2

0... done loading raw sent

sents: 2

Rand word embedding size: 36
Pretrained word embedding size: 55
loading pretrained word embeddings
finish loading
loading pretrained model
Traceback (most recent call last):
  File "train.py", line 227, in <module>
    parser.load_model(args.model)
  File "/home/chenfeng/parser/AMR-cartus/model/stack_lstm.py", line 109, in load_model
    self.pc.populate(filename)
  File "_dynet.pyx", line 1461, in _dynet.ParameterCollection.populate
  File "_dynet.pyx", line 1516, in _dynet.ParameterCollection.populate_from_textfile
RuntimeError: Dimensions of parameter /_24 looked up from file ({8557,200}) do not match parameters to be populated ({44,200})
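From the shapes in the message, this looks like a vocabulary-size mismatch rather than a corrupted model file: DyNet's ParameterCollection.populate requires every parameter in the collection to have exactly the shape that was saved, so a parser whose dictionaries were built from only two sentences cannot be populated from a model trained on the full vocabulary. A minimal sketch that reproduces the same RuntimeError (the 8557/44 sizes are taken from the traceback; the rest is illustrative and not the parser's actual code):

import dynet as dy

# Save a collection whose lookup parameter was sized from the full training vocabulary.
pc_big = dy.ParameterCollection()
pc_big.add_lookup_parameters((8557, 200))
pc_big.save("pretrain.model")

# Rebuild the "same" model, but with a vocabulary read from a tiny file (44 entries).
pc_small = dy.ParameterCollection()
pc_small.add_lookup_parameters((44, 200))
pc_small.populate("pretrain.model")  # RuntimeError: dimensions do not match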

I'd appreciate your help. Thanks a lot!

Cartus commented 4 years ago

I see. That looks like a bug. I'm afraid I'm not able to fix it right now since I'm working toward a deadline. Hopefully I can fix these bugs and release a new pre-trained model in late December.

Sorry for the inconvenience.

chenfeng15a commented 4 years ago

OK, thanks a lot! Hope your work goes well, and that you can get to this soon.
