Goals :soccer:
Make it possible to use learning rate schedulers for training MM PyTorch models.
Implementation Details :construction:
Currently it is possible to provide an Optimizer class in the PyTorch model constructor; it is then instantiated with model.parameters() inside the standard PyTorch Lightning configure_optimizers() method.
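As a rough illustration of the current behavior, here is a minimal sketch. All names (SGD, Model) are hypothetical stand-ins, not the actual torch or Lightning classes, so the example runs without either library installed:

```python
# Simplified sketch of the current pattern: the model receives an
# Optimizer *class* in its constructor and instantiates it itself
# inside configure_optimizers(). Hypothetical stand-in classes only.

class SGD:
    """Stand-in for an optimizer class like torch.optim.SGD."""
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr

class Model:
    """Stand-in for a LightningModule-style model."""
    def __init__(self, optimizer_cls=SGD, lr=0.01):
        self._optimizer_cls = optimizer_cls
        self._lr = lr
        self._params = [1.0, 2.0]  # stand-in for real model parameters

    def parameters(self):
        return iter(self._params)

    def configure_optimizers(self):
        # The optimizer is only created here, so a scheduler that needs
        # the optimizer instance cannot be passed to __init__ up front.
        return self._optimizer_cls(self.parameters(), lr=self._lr)

model = Model()
opt = model.configure_optimizers()
print(type(opt).__name__, opt.lr)  # SGD 0.01
```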
The learning rate scheduler constructor requires the optimizer instance, along with other arguments, so the scheduler cannot be provided as a constructor argument when the optimizer instance is only created inside the configure_optimizers() method.
This PR makes it possible to provide Optimizer and Scheduler instances via properties, giving the user more flexibility in instantiating those objects, including cases where different optimizers are used for different parameter groups.
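A minimal sketch of the property-based idea follows. The property names (optimizer, scheduler) and the stand-in classes are assumptions for illustration, not the PR's actual API; real code would use torch.optim and a scheduler such as torch.optim.lr_scheduler.StepLR:

```python
# Sketch: the user builds the optimizer and scheduler instances
# themselves and assigns them via properties, so the scheduler can
# receive the optimizer instance at construction time.
# All names are hypothetical stand-ins.

class SGD:
    def __init__(self, params, lr):
        self.params = list(params)
        self.lr = lr

class StepLR:
    """Stand-in for a scheduler that needs the optimizer instance."""
    def __init__(self, optimizer, step_size):
        self.optimizer = optimizer
        self.step_size = step_size

class Model:
    def __init__(self):
        self._params = [1.0, 2.0]  # stand-in for real model parameters
        self._optimizer = None
        self._scheduler = None

    def parameters(self):
        return iter(self._params)

    @property
    def optimizer(self):
        return self._optimizer

    @optimizer.setter
    def optimizer(self, opt):
        self._optimizer = opt

    @property
    def scheduler(self):
        return self._scheduler

    @scheduler.setter
    def scheduler(self, sched):
        self._scheduler = sched

    def configure_optimizers(self):
        # Instances were constructed by the user, so no class + kwargs
        # plumbing is needed here; just return what was assigned.
        if self._scheduler is not None:
            return [self._optimizer], [self._scheduler]
        return self._optimizer

model = Model()
model.optimizer = SGD(model.parameters(), lr=0.1)
model.scheduler = StepLR(model.optimizer, step_size=10)
opts, scheds = model.configure_optimizers()
print(scheds[0].optimizer is opts[0])  # True
```

Because the user holds the optimizer instance before assigning the scheduler, constructions like per-parameter-group optimizers become straightforward.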
Testing Details :mag:
Added the test test_init_optimizer_and_scheduler_instances_via_property.