After fine-tuning with a config file, the output config file currently contains a confusing key under `model.optimizer`.
This is caused by LightningCLI, which automatically overrides the `configure_optimizers` method of the task and instead instantiates the optimizers and schedulers from the `optimizer` and `lr_scheduler` keys at the root level of the config file.
The confusion arises because the tasks define default values for the optimizer, and these defaults are dumped into the config file nonetheless. As a result, the config file ends up with two different optimizers, even though the one under `model.optimizer` is never used.
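A hypothetical excerpt of such a dumped config illustrates the problem (the class paths and learning rate here are made up for illustration; only the duplicated structure matters):

```yaml
# Root-level key: the optimizer LightningCLI actually instantiates.
optimizer:
  class_path: torch.optim.AdamW
  init_args:
    lr: 0.001
model:
  init_args:
    # Unused constructor default, dumped anyway -- the source of confusion.
    optimizer: torch.optim.Adam
```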
To remove this confusion, this PR replaces the default value in the constructor with `None`. Then, in `configure_optimizers`, if the value is `None`, it is replaced with `Adam` (the current default).
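A minimal sketch of the pattern, assuming a simplified task class (the class name `MyTask` and the `FakeAdam` stand-in are hypothetical; the real tasks use `torch.optim.Adam`):

```python
from functools import partial
from typing import Callable, Optional


class FakeAdam:
    """Stand-in for torch.optim.Adam so this sketch runs without PyTorch."""

    def __init__(self, params, lr: float) -> None:
        self.params, self.lr = params, lr


class MyTask:
    """Hypothetical task illustrating the None-default pattern from this PR."""

    def __init__(self, optimizer: Optional[Callable] = None) -> None:
        # Default is None instead of a concrete optimizer, so the dumped
        # config no longer shows an unused optimizer under model.optimizer.
        self.optimizer = optimizer

    def configure_optimizers(self, parameters):
        # Apply the previous default (Adam) lazily, only when neither the
        # user nor LightningCLI supplied an optimizer.
        optimizer = self.optimizer or partial(FakeAdam, lr=1e-3)
        return optimizer(parameters)
```

With this pattern the behavior is unchanged for users who never set an optimizer, while the config file only records an optimizer when one was actually provided.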