jcyk / copyisallyouneed

Code for our ACL2021 paper Neural Machine Translation with Monolingual Translation Memory

Accuracy inquiry #5

Closed rangehow closed 3 years ago

rangehow commented 3 years ago

Problem: I ran pretrain.py to train a response encoder, and its accuracy is only about 0.2, while the checkpoint filename referenced in your shell script, e.g. 'epoch78_batch99999_acc0.99', suggests you obtained a pre-trained model with nearly perfect accuracy. Experiment details: I ran this pretraining on the WMT14 (EN-DE) dataset, the same as in your shell script, using Subword-NMT to preprocess the data and newstest2013 as dev.txt. I did not change any of the parameters set in your pretrain.sh.
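For context, the `acc` in a checkpoint name like this typically denotes in-batch retrieval accuracy for a dual encoder: the fraction of source sentences whose highest-scoring candidate (by dot product) is their paired target. A minimal sketch of that metric (the function name and NumPy setup are illustrative assumptions, not code from this repo):

```python
import numpy as np

def in_batch_retrieval_accuracy(src_emb, tgt_emb):
    """Fraction of sources whose paired target scores highest in the batch.

    src_emb, tgt_emb: (batch, dim) arrays of sentence embeddings, where
    the i-th source is paired with the i-th target.
    """
    scores = src_emb @ tgt_emb.T           # (batch, batch) similarity matrix
    pred = scores.argmax(axis=1)           # best-scoring target per source
    gold = np.arange(len(src_emb))         # gold match is the diagonal
    return float((pred == gold).mean())

# Toy check: perfectly aligned embeddings give accuracy 1.0
emb = np.eye(4)
print(in_batch_retrieval_accuracy(emb, emb))  # 1.0
```

An accuracy stuck near 0.2 under this metric would mean the two encoders are barely ranking the paired sentence above random in-batch negatives.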

Could you help me fix this problem? I am sincerely looking forward to your guidance on reproducing the work.

jcyk commented 3 years ago

Hi @rangehow, please check the updated repo and follow the instructions there.