awasthiabhijeet / PIE

Fast + Non-Autoregressive Grammatical Error Correction using BERT. Code and Pre-trained models for paper "Parallel Iterative Edit Models for Local Sequence Transduction": www.aclweb.org/anthology/D19-1435.pdf (EMNLP-IJCNLP 2019)
MIT License

Pretrained model bad correction performance? #11

Closed raff7 closed 4 years ago

raff7 commented 4 years ago

Hi, I am testing out your model and I noticed that if I run your pretrained model on the conll_test.txt file you provide, it gets very poor performance. The output is something of the sort:

Day I think I think I think remember I think I think I
Day What I think is here and I think and I
Day I think large refers every day large your every day chance every day of I think every daying every day large I think I am disease every day every day

which bears no resemblance to the input. Do you maybe know what might be going on? I am just running multi_round_infer.sh; the only flag I have changed is "use_tpu" to false (because I don't have a TPU and am running on a GPU).

awasthiabhijeet commented 4 years ago

Hi! It seems that the checkpoint is not loaded. What was the original input sentence in this case?

Are you using the scripts provided here?
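One quick way to rule out a missing checkpoint (a generic sanity check, not part of the PIE scripts; the directory and file names below are illustrative placeholders) is to verify that the TF checkpoint files the script points at actually exist:

```python
import os

def checkpoint_present(ckpt_dir, prefix="model.ckpt"):
    """Return True if a TF1-style checkpoint with the given prefix
    exists in ckpt_dir. The prefix "model.ckpt" is a placeholder,
    not a filename taken from this repo."""
    if not os.path.isdir(ckpt_dir):
        return False
    names = os.listdir(ckpt_dir)
    # A TF1 checkpoint is stored as .index/.meta/.data-* shards;
    # the .index file is the cheapest one to check for.
    return any(n.startswith(prefix + ".index") for n in names)
```

If this returns False for the path your inference script uses, TensorFlow will silently fall back to randomly initialized weights in some setups, which produces exactly this kind of garbled output.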

raff7 commented 4 years ago

Oh I see, I was running the script in "example_scripts" instead of the one in the checkpoint folder. Thanks for your quick help.