awasthiabhijeet / PIE

Fast + Non-Autoregressive Grammatical Error Correction using BERT. Code and Pre-trained models for paper "Parallel Iterative Edit Models for Local Sequence Transduction": www.aclweb.org/anthology/D19-1435.pdf (EMNLP-IJCNLP 2019)
MIT License

Which part of the code reflects the synthetic training? #27

Closed Sry2016 closed 3 years ago

awasthiabhijeet commented 3 years ago

Hi @Sry2016,

There is no separate code for synthetic training. The same training program is used for both synthetic data and real data.

The `errorify` directory contains the code for synthetic data generation. (Please refer to the README if you wish to download the already-generated synthetic data.)
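To illustrate the general idea behind synthetic data generation for grammatical error correction, here is a minimal sketch of a rule-based "errorifier" that corrupts clean sentences to produce (noisy, clean) training pairs. This is not the repo's actual `errorify` code; the function name, confusion table, and probabilities below are illustrative assumptions.

```python
import random

# Hypothetical confusion table: words commonly mistaken for one another.
COMMON_CONFUSIONS = {
    "their": "there", "there": "their",
    "then": "than", "than": "then",
    "a": "an", "an": "a",
}

def errorify(sentence, rng, p_drop=0.1, p_swap=0.1, p_confuse=0.3):
    """Return a corrupted copy of `sentence` by randomly dropping words,
    substituting common confusions, and swapping adjacent words.
    (Illustrative sketch only, not the paper's actual procedure.)"""
    words = sentence.split()
    out = []
    for w in words:
        if rng.random() < p_drop and len(words) > 3:
            continue  # drop this word
        lw = w.lower()
        if lw in COMMON_CONFUSIONS and rng.random() < p_confuse:
            out.append(COMMON_CONFUSIONS[lw])  # inject a confusion error
        else:
            out.append(w)
    # Occasionally swap two adjacent words to simulate word-order errors.
    if len(out) > 1 and rng.random() < p_swap:
        i = rng.randrange(len(out) - 1)
        out[i], out[i + 1] = out[i + 1], out[i]
    return " ".join(out)

rng = random.Random(0)
clean = "I would rather go there than stay here"
noisy = errorify(clean, rng)
print((noisy, clean))  # one (source, target) pair for training
```

Since synthetic pairs produced this way share the same (source, target) format as real annotated data, a single training program can consume either corpus, which matches the answer above.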