🚀 Feature
Does checkpoint saving work with sharded optimizers, where the optimizer state is distributed across ranks?
Motivation
A legacy test in our CI is skipped with a message saying that this is unsupported.
Pitch
We should check, and raise an appropriate error if it is indeed unsupported.
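For illustration, a rough sketch of what such a check/test could look like. The fairscale-backed `ddp_sharded` strategy and the `BoringModel` test helper are assumptions here; the strategy under test and the exact assertions are placeholders, not necessarily what the skipped legacy test did:

```python
import torch
from pytorch_lightning import Trainer
from pytorch_lightning.demos.boring_classes import BoringModel


def test_save_checkpoint_with_sharded_optimizer(tmp_path):
    # placeholder strategy: any strategy that shards the optimizer state
    # across ranks; needs at least 2 devices to actually shard anything
    model = BoringModel()
    trainer = Trainer(
        default_root_dir=tmp_path,
        strategy="ddp_sharded",
        accelerator="gpu",
        devices=2,
        fast_dev_run=True,
    )
    trainer.fit(model)

    ckpt_path = tmp_path / "model.ckpt"
    trainer.save_checkpoint(ckpt_path)

    if trainer.is_global_zero:
        checkpoint = torch.load(ckpt_path)
        # the checkpoint should contain the consolidated optimizer state
        assert checkpoint["optimizer_states"]
```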
Additional context
https://github.com/Lightning-AI/lightning/pull/14476/files/38e10ba837dad423ecaa52f300609083de379e19#r960749044
cc @tchaton @rohitgr7 @borda @akihironitta @awaelchli
@carmocca It should, because the optimizer state gets consolidated from all ranks before saving. https://github.com/Lightning-AI/lightning/blob/e0c2c3e677d141594cdd799050942b10908c9a97/src/pytorch_lightning/strategies/strategy.py#L176-L183
Adding a test (if we haven't already) can't hurt, though.
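For context, a minimal sketch of the consolidation pattern the linked lines implement. The `consolidated_optimizer_state` helper name is mine, not the actual hook; sharded optimizers such as `torch.distributed.optim.ZeroRedundancyOptimizer` and fairscale's `OSS` do expose `consolidate_state_dict()`:

```python
import torch.distributed as dist


def consolidated_optimizer_state(optimizer):
    """Return the full optimizer state dict, gathering shards if needed."""
    # sharded optimizers keep only a shard of the state on each rank and
    # expose ``consolidate_state_dict()`` to gather the full state
    if hasattr(optimizer, "consolidate_state_dict"):
        optimizer.consolidate_state_dict()  # gathers all shards onto rank 0
        # after consolidation, only rank 0 holds (and may query) the full state
        return optimizer.state_dict() if dist.get_rank() == 0 else {}
    # regular optimizers already hold their full state on every rank
    return optimizer.state_dict()
```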