grammarly / gector

Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
Apache License 2.0

What should I update if I want to do distributed training? #168

Open xiuzhilu opened 2 years ago

xiuzhilu commented 2 years ago

Hi, thank you for sharing this code. With the code as given, multi-GPU training is effectively torch.nn.DataParallel. If I want to do distributed training instead (the torch.distributed / DistributedDataParallel approach), what changes do I need to make? @skurzhanskyi @komelianchuk

skurzhanskyi commented 1 year ago

As the repository uses AllenNLP 0.8.4, we are limited to the functionality of that library.
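For reference, here is a minimal sketch of the standard PyTorch DistributedDataParallel pattern the question refers to, independent of this repository's AllenNLP-based training loop. The model, dataset, and hyperparameters are placeholders for illustration, not GECToR's; adapting the actual training script would still require working around AllenNLP 0.8.4.

```python
# Generic torch.distributed / DistributedDataParallel sketch.
# Launch with: torchrun --nproc_per_node=N train_ddp.py
# Placeholder model and data; not the GECToR training pipeline.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; a real run would build the GECToR model here
    model = torch.nn.Linear(128, 2).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(1024, 128), torch.randint(0, 2, (1024,)))
    # DistributedSampler shards the data so each process sees a unique slice
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.CrossEntropyLoss()

    for epoch in range(3):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()   # gradients are all-reduced across processes here
            optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```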