GoogleCloudPlatform / ai-infra-cluster-provisioning


Adding a new config parameter to combine layers during FSDP #360

Open tejasnagendra opened 7 months ago

tejasnagendra commented 7 months ago

Created a new class called MultiBlock, which wraps multiple Block modules to reduce the number of NCCL communications. The number of blocks to combine can be controlled with the num_blocks_to_combine parameter for GPT/Llama. A sketch of the idea follows.
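A minimal sketch of the wrapper, assuming PyTorch. The names MultiBlock, Block, and num_blocks_to_combine come from this PR's description; the Block internals and the group_blocks helper are hypothetical stand-ins, not the actual implementation.

```python
import torch.nn as nn

class Block(nn.Module):
    """Hypothetical stand-in for one transformer block (attention + MLP)."""
    def __init__(self, hidden_size: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.GELU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )

    def forward(self, x):
        attn_out, _ = self.attn(x, x, x)
        x = x + attn_out
        return x + self.mlp(x)

class MultiBlock(nn.Module):
    """Wraps several Blocks so FSDP shards and all-gathers them as one
    unit, producing fewer, larger NCCL messages."""
    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def forward(self, x):
        for block in self.blocks:
            x = block(x)
        return x

def group_blocks(blocks, num_blocks_to_combine):
    """Partition a list of Blocks into MultiBlocks of the given size."""
    return nn.ModuleList(
        MultiBlock(blocks[i:i + num_blocks_to_combine])
        for i in range(0, len(blocks), num_blocks_to_combine)
    )
```

With this grouping, an FSDP auto-wrap policy (e.g. transformer_auto_wrap_policy with MultiBlock as the transformer layer class) would shard at MultiBlock rather than Block granularity, which is what cuts the NCCL call count.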

Ideally the message size should be around 1 GB to get the best performance. But when a model runs across multiple nodes, every layer is split into very small shards, making each message extremely small and leaving network bandwidth underutilized. This parameter can be tuned to ensure we send larger messages; the arithmetic below illustrates the effect.
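A back-of-envelope calculation of why combining blocks helps, using illustrative numbers that are assumptions, not figures from this PR. With FSDP, each wrapped unit's parameters are all-gathered in one collective, so combining k blocks multiplies the per-shard message size by roughly k.

```python
# Hypothetical model and cluster sizes, chosen only for illustration.
hidden = 8192                       # hidden size
params_per_block = 12 * hidden**2   # rough attention + MLP weights per block
bytes_per_param = 2                 # bf16
world_size = 256                    # e.g. 32 nodes x 8 GPUs

# Per-GPU shard that each rank contributes to the all-gather.
shard = params_per_block * bytes_per_param / world_size
print(f"1 block  -> {shard / 2**20:.1f} MiB per all-gather shard")

k = 8  # num_blocks_to_combine
print(f"{k} blocks -> {k * shard / 2**20:.1f} MiB per all-gather shard")
```

At large world sizes the per-shard message shrinks toward a few MiB, which is far below the ~1 GB sweet spot mentioned above; raising num_blocks_to_combine scales it back up.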

google-cla[bot] commented 7 months ago

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up-to-date status, view the checks section at the bottom of the pull request.