Currently, the BERT finetuning scripts for both MRPC and SQuAD use only a single GPU. It would be great to enhance these scripts so that multiple GPUs can be used to accelerate training.
I am working on a multi-GPU option for the pre-training script; if someone can add a similar capability to the finetuning scripts, that would be great.
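The standard way to extend a single-GPU training script to multiple GPUs is data parallelism: split each batch into shards, compute a gradient per device on its shard, and average the shard gradients before the parameter update. The sketch below illustrates only the concept, with no real GPUs or framework; all function names are hypothetical, and the "devices" are simulated by slicing the batch.

```python
# Toy sketch of data-parallel training: the batch is split across
# simulated "devices", each computes a gradient on its shard, and the
# averaged shard gradients equal the full-batch gradient (for equal
# shard sizes). Real multi-GPU finetuning would do the same averaging
# via an all-reduce across devices.

def grad_mse(w, xs, ys):
    """Gradient of mean squared error for a 1-D linear model y = w*x."""
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

def data_parallel_grad(w, xs, ys, num_devices):
    """Split the batch into equal shards, one per simulated device,
    compute each shard's gradient, and average the results."""
    shard = len(xs) // num_devices
    grads = [
        grad_mse(w, xs[i * shard:(i + 1) * shard],
                    ys[i * shard:(i + 1) * shard])
        for i in range(num_devices)
    ]
    return sum(grads) / num_devices

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
w = 0.5

full = grad_mse(w, xs, ys)
parallel = data_parallel_grad(w, xs, ys, num_devices=2)
assert abs(full - parallel) < 1e-12  # shard averaging matches the full batch
```

Because averaged shard gradients reproduce the full-batch gradient, a data-parallel run converges like the single-GPU run while each device processes only a fraction of the batch, which is what makes the speedup possible.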
Does the finetuning script support multiple GPUs now?
I felt a little disappointed when I came across this issue.
Still, this project supports multi-GPU pre-training, which is awesome!