Lightning-AI / pytorch-lightning


Deprecate passing a container to `self.save_hyperparameters()` #10375

Open awaelchli opened 3 years ago

awaelchli commented 3 years ago

🚀 Deprecation

As the title says, consider deprecating the ability to pass a container (dict, DictConfig, etc.) to LightningModule.save_hyperparameters.

Motivation

See #10280 for a bit more context. There is confusion about how to save hyperparameters in a LightningModule or DataModule when the argument passed to the constructor is a container. We see this pattern:


# A) 
class Model(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.save_hyperparameters(hparams)

when it really should just be:


# B) 
class Model(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.save_hyperparameters()
        # or
        self.save_hyperparameters("hparams")

Both ways work just fine until you try to load the model from a checkpoint:


# Fails with A), works with B)
model = Model.load_from_checkpoint("path")

The reason this fails is that Model.load_from_checkpoint takes the container saved in the checkpoint and passes its contents in as Model(**kwargs). But the signature does not match in A), because the model is supposed to receive the container itself as input, not its contents.
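For illustration, a minimal sketch of what happens at load time (the keys lr and hidden_dim are hypothetical):

# Hypothetical container passed to the constructor
hparams = {"lr": 0.01, "hidden_dim": 64}

# With A), the checkpoint stores the *contents* of the container, so loading
# effectively calls:
#     Model(lr=0.01, hidden_dim=64)
# which raises a TypeError because __init__ only accepts a single `hparams` argument.
#
# With B), Lightning records that the constructor argument was named "hparams",
# so loading effectively calls:
#     Model(hparams={"lr": 0.01, "hidden_dim": 64})
# which matches the signature and works.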

Q: Why do A) and B) both exist? A: A) is an artifact of the old way of saving hyperparameters: back in 1.0 the recommended approach was to have a container like this and assign it with self.hparams = hparams. save_hyperparameters was introduced to generalize this while staying as backward compatible as possible. Lots of feet have been shot.

Q: Why can't we just special-case things so it works both ways? A: One can special-case this particular example, but it is easy to find another one where things break down (just add more args; see the sketch below). This has happened in the past. The new way (B) of saving hyperparams is the recommended one because IT JUST WORKS.
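To make the "just add more args" point concrete, here is a hedged sketch (the extra learning_rate argument is hypothetical):

class Model(LightningModule):
    def __init__(self, hparams, learning_rate):
        super().__init__()
        # Even if passing the container were special-cased, only its contents end up
        # in the checkpoint; `learning_rate` is not captured at all, and at load time
        # there is no way to tell which saved keys should be re-packed into `hparams`
        # and which should be matched against `learning_rate`.
        self.save_hyperparameters(hparams)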

Pitch

Deprecate passing a full container to the save_hyperparameters() method. A warning will be shown with simple instructions on how the user can change their code in one line.
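Not the actual implementation in the linked PR, but a rough sketch of how such a check could look, assuming Lightning's rank_zero_deprecation helper (the helper function name here is hypothetical):

from pytorch_lightning.utilities import rank_zero_deprecation

def _warn_on_container(args):
    # Hypothetical helper: warn when a bare container is passed instead of letting
    # save_hyperparameters() collect the constructor arguments by itself.
    if any(isinstance(arg, dict) for arg in args):  # DictConfig, Namespace, ... could be added
        rank_zero_deprecation(
            "Passing a container to `save_hyperparameters(hparams)` is deprecated."
            " Call `self.save_hyperparameters()` or `self.save_hyperparameters('hparams')`"
            " inside `__init__` instead."
        )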

Alternatives

Still support it. Document it only in the API signature but not in our user guides.

Additional context

The docs are already being improved in #10280. This also closes #8948.

cc @borda @tchaton @rohitgr7 @akihironitta @carmocca



tchaton commented 3 years ago

Sounds good to me! Let's do it!

timurlenk07 commented 2 years ago

Hi! I have a use case that would be impacted by this change. I currently use the save_hyperparameters function to save the hyperparameters of training and augmentation to TensorBoard. Since augmentation is done outside of the training process, I do not pass the augmentation hyperparameters directly to the LightningModule; instead, I pass all the hyperparameters (generated by an outside engine, Ray Tune) as a single dict and save that.
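For reference, a minimal sketch of that pattern (the tune_config dict and its keys are hypothetical):

# A single dict assembled outside the LightningModule (e.g. by Ray Tune)
tune_config = {"lr": 1e-3, "batch_size": 32, "aug_rotation": 15}

class Model(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        # Saves everything in the dict, including the augmentation settings,
        # so it gets logged to TensorBoard. This is exactly the A) pattern above.
        self.save_hyperparameters(hparams)

model = Model(tune_config)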

awaelchli commented 2 years ago

Hi! Once the change lands, a warning will be shown to users with your use case. It is recommended to change your code in one of the following ways:

Option 1)

Change your code to:

class Model(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.save_hyperparameters()
        # or
        self.save_hyperparameters("hparams")

Option 2)

Change your code to:

class Model(LightningModule):
    def __init__(self, **hparams):
        super().__init__()
        self.save_hyperparameters()

and change your model init to model = Model(**myparams)
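A short usage sketch for Option 2 (the hyperparameter names and checkpoint path are hypothetical):

myparams = {"lr": 1e-3, "hidden_dim": 64}

model = Model(**myparams)  # the dict is expanded into keyword arguments
# save_hyperparameters() records lr and hidden_dim individually, so this now works:
model = Model.load_from_checkpoint("example.ckpt")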

A warning of this kind will be shown to all users who call save_hyperparameters(hparams) on a container directly.

awaelchli commented 2 years ago

@Borda @carmocca I suggest moving forward with this in 1.8. The PR can be updated and finished within a short time: https://github.com/Lightning-AI/lightning/pull/10608