dmlc / gluon-nlp

NLP made easy
https://nlp.gluon.ai/
Apache License 2.0

[BERT] Multi-GPU support for BERT finetuning scripts #524

Open eric-haibin-lin opened 5 years ago

eric-haibin-lin commented 5 years ago

Currently, the BERT finetuning scripts for MRPC and SQuAD each use only a single GPU. It would be great to enhance these scripts so that multiple GPUs can be used to accelerate training.

I am working on a multi-GPU option for the pre-training script; if someone could add a similar capability to the finetuning scripts, that would be great.
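For reference, data-parallel training in Gluon usually follows the split-and-load pattern sketched below. This is only a minimal sketch of that pattern, not the finetuning script's actual code: the `Dense` model, loss, synthetic data, optimizer, and hyperparameters are placeholders standing in for the BERT model and the MRPC/SQuAD pipeline.

```python
import mxnet as mx
from mxnet import gluon, autograd

# Placeholders for the BERT model and loss used by the finetuning script;
# the data-parallel pattern below does not depend on the model choice.
net = gluon.nn.Dense(2)
loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()

num_gpus = mx.context.num_gpus()
ctx = [mx.gpu(i) for i in range(num_gpus)] if num_gpus > 0 else [mx.cpu()]

net.initialize(mx.init.Xavier(), ctx=ctx)         # parameters live on every device
trainer = gluon.Trainer(net.collect_params(), 'adam',
                        {'learning_rate': 5e-5})  # optimizer choice is illustrative

# Synthetic (data, label) batches standing in for the real MRPC/SQuAD loader.
dataset = gluon.data.ArrayDataset(
    mx.nd.random.uniform(shape=(64, 128)),
    mx.nd.random.randint(0, 2, shape=(64,)).astype('float32'))
loader = gluon.data.DataLoader(dataset, batch_size=16)

for data, label in loader:
    # Shard each batch along the batch axis and scatter the shards to the devices.
    data_list = gluon.utils.split_and_load(data, ctx_list=ctx,
                                           batch_axis=0, even_split=False)
    label_list = gluon.utils.split_and_load(label, ctx_list=ctx,
                                            batch_axis=0, even_split=False)

    # Run one forward pass per device inside a single autograd scope.
    with autograd.record():
        losses = [loss_fn(net(x), y) for x, y in zip(data_list, label_list)]
    for l in losses:
        l.backward()

    # The Trainer aggregates gradients from all devices; normalize by the
    # global batch size so the learning rate keeps its single-GPU meaning.
    trainer.step(data.shape[0])
```

The key pieces are `gluon.utils.split_and_load`, which copies per-device slices of each batch onto the context list, and `gluon.Trainer`, which sums the gradients from all devices before applying the update.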

Gpwner commented 5 years ago

Does it support multi-GPU finetuning now? When I came across this issue, I felt a little disappointed. But anyway, this project supports multi-GPU pretraining, which is awesome!