microsoft / DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
https://www.deepspeed.ai/
Apache License 2.0

Hybrid Offloading for ZeRO3 #5625

Open tohtana opened 3 weeks ago

tohtana commented 3 weeks ago

NOTE: This feature works only for the forward pass.

This feature allows users to gather ZeRO3-partitioned params and offload only a part of them to host memory. The offloaded parameters are loaded to device memory in a pre-forward hook and offloaded back to host memory in a post-forward hook.

This lets you avoid repeated all-gathers inside a loop:

      with deepspeed.zero.ZeRO3HybridOffload(model, param_threshold=1e9):
          for x in dataset:
              output = model(x)
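
The pre/post-forward hook mechanism described above can be sketched in plain Python. All names here (`Param`, `ToyModel`, `make_offload_hooks`) are invented for illustration and are not DeepSpeed's API; the real implementation moves actual tensors between CUDA and host memory inside `ZeRO3HybridOffload`:

```python
class Param:
    """Stand-in for a gathered ZeRO3 parameter (no real tensor data)."""
    def __init__(self, name, numel, location="host"):
        self.name = name
        self.numel = numel
        self.location = location

class ToyModel:
    """Minimal module with registrable pre/post-forward hooks."""
    def __init__(self, params):
        self.params = params
        self.pre_hooks = []
        self.post_hooks = []

    def forward(self, x):
        for hook in self.pre_hooks:
            hook(self)
        # Compute happens here; every param must be resident on device.
        assert all(p.location == "device" for p in self.params)
        y = x * 2
        for hook in self.post_hooks:
            hook(self)
        return y

def make_offload_hooks(model, param_threshold):
    # Keep the largest params that fit under the threshold resident on
    # the device; everything else is shuttled host<->device by hooks.
    budget = param_threshold
    resident, offloaded = [], []
    for p in sorted(model.params, key=lambda p: p.numel, reverse=True):
        if p.numel <= budget:
            p.location = "device"
            budget -= p.numel
            resident.append(p)
        else:
            p.location = "host"
            offloaded.append(p)

    def pre_forward(m):
        for p in offloaded:
            p.location = "device"  # host -> device copy in the real thing

    def post_forward(m):
        for p in offloaded:
            p.location = "host"    # device -> host copy

    model.pre_hooks.append(pre_forward)
    model.post_hooks.append(post_forward)
    return resident, offloaded

# Hypothetical usage: one 10-element param stays resident, the rest
# are offloaded between forward passes.
params = [Param("a", 3), Param("b", 5), Param("c", 10)]
model = ToyModel(params)
resident, offloaded = make_offload_hooks(model, param_threshold=12)
out = model.forward(1)
```

After `forward` returns, the offloaded params are back on the host while the resident ones never leave the device, which is the behavior the context manager provides for the real partitioned parameters.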

Generation with auto-regressive models is a good example of where this feature is useful. Plain ZeRO3 usually doesn't work there because different ranks may generate sequences of different lengths: the all-gather gets stuck once one of the ranks finishes generation while the others still need to run forward passes.

      with deepspeed.zero.ZeRO3HybridOffload(model, param_threshold=1e9):
          output = model.generate(input_ids)
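
A toy accounting of collective calls shows why the uneven lengths deadlock plain ZeRO3 but not the hybrid-offload path. No real distributed backend is used; the ranks and step counts are illustrative:

```python
# Two ranks generating sequences of different lengths.
steps_per_rank = {0: 3, 1: 5}  # rank 0 stops generating after 3 steps

# Plain ZeRO3: each forward step triggers an all-gather, and an
# all-gather only completes when *every* rank joins it.
plain_calls = dict(steps_per_rank)                    # {0: 3, 1: 5}
plain_deadlocks = len(set(plain_calls.values())) > 1
# True: rank 1's 4th all-gather waits forever, since rank 0 has
# already exited its generation loop and never joins it.

# Hybrid offload: parameters are gathered once, up front, when the
# context manager is entered, so the per-step forwards issue no
# collectives at all and ranks may run different numbers of steps.
hybrid_calls = {rank: 1 for rank in steps_per_rank}   # {0: 1, 1: 1}
hybrid_deadlocks = len(set(hybrid_calls.values())) > 1  # False
```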