Is your feature request related to a problem? Please describe.
Currently it is not possible to select a specific optimizer from the YAML file. By default, SGD is selected, and the optimizer key/values in the config files correspond to its parameters.
However, as witnessed with the GLC24_pre_extracted multimodal ensemble model, the optimizer needed to be changed to AdamW in the experiment's datamodule code.
Describe the solution you'd like
Change the optimizer section of all config files to allow selecting an optimizer by its name, with its parameters in a nested dictionary.
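As an illustration, the optimizer section could take the following shape (the exact key names and the `params` nesting are only a suggestion, not a settled schema):

```yaml
optimizer:
  name: adamw          # simple name for well-known optimizers
  params:              # keyword arguments forwarded to the optimizer constructor
    lr: 0.001
    weight_decay: 0.01
```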
Modify malpolon.models.standard_prediction_systems.ClassificationSystem so that the class takes an additional optimizer argument of type dictionary, forwarded from the config file, and instantiates the corresponding optimizer.
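A minimal sketch of what this could look like, assuming the optimizer dict carries `name` and `params` keys and omitting the class's other arguments (the `_build_optimizer` helper is hypothetical; in practice the resolution would defer to malpolon.models.utils, see below):

```python
import torch


class ClassificationSystem:
    """Stripped-down sketch of
    malpolon.models.standard_prediction_systems.ClassificationSystem."""

    def __init__(self, model: torch.nn.Module, optimizer: dict):
        # `optimizer` is the nested dict forwarded from the config file,
        # e.g. {"name": "adamw", "params": {"lr": 1e-3}}.
        self.model = model
        self.optimizer = self._build_optimizer(optimizer)

    def _build_optimizer(self, cfg: dict) -> torch.optim.Optimizer:
        # Hypothetical inline resolution; the real lookup would go through
        # check_optimizer in malpolon.models.utils.
        known = {"sgd": torch.optim.SGD, "adamw": torch.optim.AdamW}
        optim_cls = known[cfg["name"].lower()]
        return optim_cls(self.model.parameters(), **cfg.get("params", {}))
```

For example, `ClassificationSystem(torch.nn.Linear(10, 2), {"name": "adamw", "params": {"lr": 1e-3}})` would build the system with an AdamW optimizer.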
I recommend following the same strategy as for metrics: a small set of well-known optimizers should be callable by simple names via an optimizer.name entry in the config file. For more sophisticated optimizers, users should register the full PyTorch optimizer module name.
-> In malpolon.models.utils: build an OPTIMIZERS_CALLABLES dictionary, and update check_optimizer in a similar fashion to check_metric.
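A sketch of what malpolon.models.utils could gain, mirroring the existing check_metric pattern (the dictionary contents, the fallback on a fully qualified module path, and the exact signature of check_optimizer are all assumptions):

```python
import importlib

import torch

# Hypothetical registry of well-known optimizers addressable by simple name.
OPTIMIZERS_CALLABLES = {
    "sgd": torch.optim.SGD,
    "adam": torch.optim.Adam,
    "adamw": torch.optim.AdamW,
}


def check_optimizer(name: str) -> type:
    """Resolve `name` to an optimizer class: first through
    OPTIMIZERS_CALLABLES, then as a fully qualified module path
    (e.g. "torch.optim.RAdam") for more sophisticated optimizers."""
    key = name.lower()
    if key in OPTIMIZERS_CALLABLES:
        return OPTIMIZERS_CALLABLES[key]
    # Fall back: treat `name` as "some.module.ClassName" and import it.
    module_path, _, cls_name = name.rpartition(".")
    return getattr(importlib.import_module(module_path), cls_name)
```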