🚀 Feature
Recent PyTorch versions introduced the following warning:
https://github.com/pytorch/pytorch/blob/d0adb5ff264df8e0e057ce1178feb30198c601d2/torch/csrc/distributed/c10d/reducer.cpp#L1256-L1264
which appears when find_unused_parameters=True is set but the model actually has no unused parameters.

Motivation
We set find_unused_parameters=True by default, so this warning will appear for a large percentage of users.
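For context, users can already opt out of the default themselves. A minimal sketch, assuming a Lightning version where DDPStrategy is importable from pytorch_lightning.strategies and forwards its find_unused_parameters argument to DistributedDataParallel:

```python
# Minimal sketch: opting out of the default silences the underlying DDP warning.
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DDPStrategy

trainer = Trainer(
    accelerator="gpu",
    devices=2,
    # Skip the extra autograd-graph traversal when no parameters are unused.
    strategy=DDPStrategy(find_unused_parameters=False),
)
```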
Pitch

Catch this warning and emit our own in its place, one that explicitly explains how to modify the strategy passed to the Trainer.
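A minimal sketch of what the catch-and-rewrite could look like, assuming the reducer warning surfaces in Python as a UserWarning whose message mentions find_unused_parameters; the helper name and the message matching are hypothetical, not an existing Lightning API:

```python
import warnings
from contextlib import contextmanager

@contextmanager
def rewrite_unused_parameters_warning():
    # Hypothetical helper: record warnings raised inside the block,
    # replace the DDP reducer warning with Lightning-specific advice,
    # and re-emit everything else unchanged.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        yield
    for w in caught:
        if "find_unused_parameters" in str(w.message):
            warnings.warn(
                "The model has no unused parameters, so "
                "find_unused_parameters=True only adds overhead. Pass "
                "strategy=DDPStrategy(find_unused_parameters=False) to the "
                "Trainer to remove this warning and improve performance.",
                category=UserWarning,
            )
        else:
            # Forward unrelated warnings with their original location intact.
            warnings.warn_explicit(w.message, w.category, w.filename, w.lineno)
```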
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @borda @akihironitta