What does this PR do?
This PR updates the `remove_hook_from_module` function so that it also removes the warning hooks that were added during the dispatch function. I've added a small test to verify that the warning is no longer emitted.

Fixes https://github.com/huggingface/accelerate/issues/2840