Type of Change
[x] Bug fix (non-breaking change which fixes an issue)
[x] New feature (non-breaking change which adds functionality)
[x] Breaking change (fix or feature that would cause existing functionality to not work as expected)
[x] This change requires a documentation update
BMTrain New Version Release v1.0.0
Issue Reference
Issue #174
Description
This new version of BMTrain introduces Tensor Parallelism and a significant restructuring of the codebase. We have enhanced flexibility by allowing more granular control over ZeRO levels and Activation Checkpointing. This control is now achievable at the `bmt.Block` (alias of `bmt.CheckpointBlock`) level, enabling targeted optimizations. For instance, it is now possible to apply aggressive ZeRO settings or disable Checkpointing in specific layers as needed. Additionally, the updated BMTrain features a suite of operators designed explicitly for Tensor Parallel training.
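As a conceptual sketch of the per-block control described above — choosing a ZeRO level and a checkpointing flag layer by layer rather than globally — the snippet below illustrates the selection logic only. The names `zero_level` and `use_checkpoint` are assumptions for illustration, not the confirmed `bmt.Block` API; consult the BMTrain documentation for the actual parameter names.

```python
# Illustrative only: pick per-layer (zero_level, use_checkpoint) settings,
# in the spirit of the bmt.Block-level control this release describes.
def block_settings(num_layers, heavy_layers=()):
    """Return a (zero_level, use_checkpoint) pair for each layer.

    Layers in `heavy_layers` get aggressive ZeRO (level 3) and keep
    activation checkpointing; all other layers use ZeRO level 2 and
    skip checkpointing, trading memory for speed.
    """
    settings = []
    for i in range(num_layers):
        if i in heavy_layers:
            settings.append((3, True))   # memory-hungry layer
        else:
            settings.append((2, False))  # cheap layer, no recompute
    return settings

print(block_settings(4, heavy_layers={1, 2}))
# -> [(2, False), (3, True), (3, True), (2, False)]
```

Each resulting pair would then be passed to the corresponding block wrapper when constructing the model, so that only the layers that need it pay the recomputation cost of checkpointing.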
How Has This Been Tested?
(Describe the tests used to validate the changes. Please provide instructions for reproduction.)
Checklist
Additional Information
(Provide any additional information, configuration details, or data necessary for the review.)