grammatical/pretraining-bea2019

Models, system configurations, and outputs of our winning GEC systems from the BEA 2019 shared task, described in R. Grundkiewicz, M. Junczys-Dowmunt, and K. Heafield: "Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data," BEA 2019.
MIT License

The outputs of the dev and test sets #2

Closed: GitHub-v7 closed this issue 5 years ago

GitHub-v7 commented 5 years ago

Hi, I'm afraid I cannot reproduce the impressive results reported in your paper. Could your team share the dev and test set outputs, please? Thank you.