grammatical / pretraining-bea2019

Models, system configurations, and outputs of our winning GEC systems in the BEA 2019 shared task, described in R. Grundkiewicz, M. Junczys-Dowmunt, K. Heafield: "Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data", BEA 2019.
MIT License

How to train the model? #7

Closed: mirfan899 closed this issue 1 year ago

mirfan899 commented 3 years ago

There is no documentation on how to train the model.

saramoeini20 commented 1 year ago

> There is no documentation on how to train the model.

Hi. Did you figure out how to train it?
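
Since the question went unanswered in this thread: the paper's systems were built with the Marian NMT toolkit, first pre-training a Transformer on synthetic errorful data and then fine-tuning on the authentic BEA parallel data. Below is a minimal sketch of what such a Marian training invocation could look like. The file names, paths, and hyperparameter values are illustrative assumptions, not taken from this repository's actual scripts; consult the configs shipped with the released models for the settings the authors used.

```bash
#!/bin/bash
# Hypothetical sketch of pre-training a Transformer GEC model with Marian.
# All paths and hyperparameters below are assumptions for illustration only.
marian \
    --model model/model.npz --type transformer \
    --train-sets data/synthetic.src data/synthetic.trg \
    --vocabs model/vocab.spm model/vocab.spm \
    --mini-batch-fit --maxi-batch 1000 \
    --valid-sets data/dev.src data/dev.trg \
    --valid-metrics ce-mean-words \
    --valid-freq 5000 --save-freq 5000 --disp-freq 500 \
    --tied-embeddings-all --exponential-smoothing \
    --learn-rate 0.0003 --lr-warmup 16000 --lr-decay-inv-sqrt 16000 \
    --devices 0 1 2 3 --sync-sgd --seed 1

# Fine-tuning would then continue training from the pre-trained weights,
# with --train-sets pointed at the authentic BEA parallel data instead.
```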