utterworks / fast-bert

Super easy library for BERT based NLP models
Apache License 2.0

Using multiple training instances in AWS Sagemaker. #207

Open nectario opened 4 years ago

nectario commented 4 years ago

Is it possible to speed up BERT training by using multiple training instances?

kaushaltrivedi commented 4 years ago

You can use a p3.8xlarge or larger instance for parallel processing across multiple GPUs.

kaushaltrivedi commented 4 years ago

You need to set the multi_gpu flag to true.

nectario commented 4 years ago

Thank you. In which config file do I set this?

aaronbriel commented 4 years ago

@nectario You would set this in the initialization of BertDataBunch.
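To make the advice above concrete, here is a minimal sketch. The helper deciding whether to enable the flag is runnable as-is; the commented-out BertDataBunch call follows the keyword names shown in the fast-bert README, and all paths and file names in it are hypothetical placeholders, not values from this thread.

```python
def should_enable_multi_gpu(gpu_count: int) -> bool:
    """multi_gpu only pays off when more than one device is visible."""
    return gpu_count > 1

# A p3.8xlarge exposes 4 V100 GPUs, so the flag resolves to True there.
multi_gpu = should_enable_multi_gpu(4)

# The flag is then passed when initializing the data bunch. Assumed
# fast-bert API (per its README); placeholder paths and file names:
#
# from fast_bert.data_cls import BertDataBunch
# databunch = BertDataBunch(
#     "./data/", "./labels/",
#     tokenizer="bert-base-uncased",
#     train_file="train.csv",
#     val_file="val.csv",
#     label_file="labels.csv",
#     text_col="text",
#     label_col="label",
#     batch_size_per_gpu=16,
#     max_seq_length=512,
#     multi_gpu=multi_gpu,
#     model_type="bert",
# )
```

Note that multi_gpu parallelizes across the GPUs of a single instance; it does not by itself distribute training across multiple SageMaker instances.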