huggingface / accelerate

🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
https://huggingface.co/docs/accelerate
Apache License 2.0

[MLU] update deepspeed-mlu dependency #3192

Open. Andy666G opened this pull request 4 weeks ago.

Andy666G commented 4 weeks ago

What does this PR do?

Update the deepspeed-mlu dependency. Related PRs:

  1. https://github.com/huggingface/transformers/pull/34362
  2. https://github.com/microsoft/DeepSpeed/pull/6472

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @muellerzr

HuggingFaceDocBuilderDev commented 4 weeks ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.