Closed — allenyummy closed this issue 5 years ago
All of our results were obtained by fine-tuning on TPUs, so multi-GPU usage is not supported.
If you're interested in multi-GPU training for BERT, the transformers package from Hugging Face is a great resource for doing so.
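As a hedged sketch of what multi-GPU fine-tuning looks like in PyTorch: when more than one GPU is visible, the transformers `Trainer` wraps the model in `torch.nn.DataParallel`, which replicates the model on each device and splits the batch across them. The `nn.Linear` stand-in below is illustrative, not part of the actual BERT model.

```python
# Minimal sketch: wrapping a model in torch.nn.DataParallel so a forward
# pass is split across all visible GPUs. transformers' Trainer applies the
# same wrapper automatically when torch.cuda.device_count() > 1.
import torch
import torch.nn as nn

model = nn.Linear(768, 2)  # stand-in for a BERT classification head
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)  # replicates model, scatters the batch

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

batch = torch.randn(8, 768, device=device)  # fake batch of pooled outputs
logits = model(batch)
print(logits.shape)  # torch.Size([8, 2])
```

With this wrapper in place, `CUDA_VISIBLE_DEVICES=0,1,2 python train.py` would actually use all three GPUs; setting the environment variable alone does nothing if the script never distributes the model.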
Hi,
I prepended CUDA_VISIBLE_DEVICES=0,1,2 to python run_commonsense_qa.py, but it clearly still runs on a single GPU (I checked usage with nvidia-smi).
How can I run the fine-tuning script on multiple GPUs?
Thanks for your reply.