huggingface / accelerate

🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
https://huggingface.co/docs/accelerate
Apache License 2.0

Update Megatron-LM plugin code to version 0.8.0 or higher #3174

Closed eljandoubi closed 4 weeks ago

eljandoubi commented 1 month ago

What does this PR do?

I have adapted the Megatron-LM plugin code to work with version 0.8.0 or higher. The setup is the same as before, but Megatron-LM now uses core_r0.8.0.

Who can review?

HuggingFaceDocBuilderDev commented 4 weeks ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

eljandoubi commented 4 weeks ago

No! In fact, Megatron-LM functions and classes move from one module to another between versions.
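This kind of module churn is why a plugin often resolves imports defensively, trying the newer path first and falling back to the older one. A minimal sketch of the pattern; the Megatron-LM module paths below are illustrative assumptions, not the exact ones touched by this PR:

```python
# Sketch: resolve a module that moved between releases by trying each
# candidate dotted path in order. Returns None if nothing is importable.
import importlib


def resolve(candidates):
    """Return the first importable module from a list of dotted paths."""
    for path in candidates:
        try:
            return importlib.import_module(path)
        except ImportError:
            continue
    return None


# Hypothetical example of a module whose location changed across releases;
# the actual Megatron-LM layout differs per version.
dist_ckpt = resolve([
    "megatron.core.dist_checkpointing",  # assumed newer layout
    "megatron.dist_checkpointing",       # assumed older layout
])
```

The plugin can then branch on whether `resolve` returned a module, instead of hard-coding one import path per supported release.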

muellerzr commented 4 weeks ago

(The failing tests are unrelated; they come from a Python update.)

eljandoubi commented 4 weeks ago

Megatron-LM 0.9.0 is out. I believe the plugin is still functional. Can you confirm?
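One way to check compatibility programmatically is to gate the plugin on the installed megatron-core version. A hedged sketch, assuming the package is distributed under the name `megatron-core` with a plain `X.Y.Z` version scheme:

```python
# Sketch: best-effort check that the installed megatron-core meets a
# minimum version, e.g. the 0.8.0 targeted by this PR.
from importlib.metadata import version, PackageNotFoundError


def parse_version(v):
    """Turn an 'X.Y.Z...' string into a comparable integer tuple (best-effort)."""
    parts = []
    for piece in v.split(".")[:3]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def megatron_meets(min_version=(0, 8, 0)):
    """True if megatron-core is installed at or above min_version."""
    try:
        return parse_version(version("megatron-core")) >= min_version
    except PackageNotFoundError:
        return False
```

With this, a 0.9.0 install would satisfy a `(0, 8, 0)` floor, so confirming 0.9.0 support still requires running the plugin's test suite, not just the version check.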