OptimalFoundation / nadir

Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
https://nadir.rtfd.io
Apache License 2.0

Refactor Optimizer Config API #45

Open bhavnicksm opened 6 months ago

bhavnicksm commented 6 months ago

Currently, every optimizer ships with its own config class that manages that optimizer's hyperparameters.

This issue exists for the following reason: the config-based design diverges from the PyTorch-esque way of writing optimizers, where each class is constructed directly with the full set of its hyperparameters.

The goal of this issue is to find a decent middle ground between the two: allow immediate, small hyperparameter tweaks on the spot, directly on the optimizer object, while also allowing for inheritable parameters.

Supporting both adds significant overhead and makes things a bit more complex, so it should ideally be abstracted away in BaseConfig and BaseOptimizer, along with all the utility functions that transform from one format to the other.
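
As a rough illustration of one possible shape for this (not the actual nadir API), a dataclass-based config can carry the inheritable defaults, while the optimizer constructor accepts either a config object or PyTorch-style keyword overrides. The names `SGDConfig` and `with_overrides` below are hypothetical:

```python
# Illustrative sketch only: a dataclass config with inheritable defaults,
# and a base optimizer that merges PyTorch-style keyword overrides on top.
from dataclasses import asdict, dataclass, replace

import torch


@dataclass
class BaseConfig:
    lr: float = 1e-3

    def with_overrides(self, **overrides):
        # Copy-on-write: on-the-spot tweaks never mutate a shared config.
        return replace(self, **overrides)


@dataclass
class SGDConfig(BaseConfig):  # hypothetical per-optimizer config
    momentum: float = 0.0
    weight_decay: float = 0.0


class BaseOptimizer(torch.optim.Optimizer):
    config_class = BaseConfig

    def __init__(self, params, config=None, **overrides):
        # Inheritable defaults come from the config object; immediate
        # tweaks come in as plain keyword arguments, PyTorch-style.
        self.config = (config or self.config_class()).with_overrides(**overrides)
        super().__init__(params, defaults=asdict(self.config))


# Usage: inherit defaults from a config, tweak lr on the spot.
model = torch.nn.Linear(4, 2)
opt = BaseOptimizer(model.parameters(), config=SGDConfig(momentum=0.9), lr=0.05)
print(opt.defaults["lr"])  # 0.05
```

A side benefit of this sketch: an unknown override name raises a `TypeError` from `dataclasses.replace`, which doubles as basic hyperparameter validation.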

bhavnicksm commented 5 months ago

Config classes were originally introduced because I expected the optimizers to gain a lot of "features" or tunable options over time, which makes the constructors really long.

Imagine the complexity if each new optimizer had to declare 30+ arguments in its init signature. Sounds tough!
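
To make that concern concrete, here is a hypothetical sketch (the config names are invented for illustration) of how dataclass inheritance keeps those 30+ options out of each constructor: a new optimizer's config only declares the fields it adds or changes, and inherits the rest.

```python
# Hypothetical illustration: configs scale better than long __init__
# signatures because inherited hyperparameters never need re-listing.
from dataclasses import dataclass, fields


@dataclass
class AdamConfig:
    lr: float = 1e-3
    betas: tuple = (0.9, 0.999)
    eps: float = 1e-8
    weight_decay: float = 0.0
    amsgrad: bool = False
    # ...imagine 25 more tuning knobs accumulating over time.


@dataclass
class AdamWConfig(AdamConfig):
    # Only the new or changed hyperparameters are declared here;
    # everything else is inherited from AdamConfig.
    weight_decay: float = 1e-2
    decoupled: bool = True


print([f.name for f in fields(AdamWConfig)])
# ['lr', 'betas', 'eps', 'weight_decay', 'amsgrad', 'decoupled']
```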

bhavnicksm commented 5 months ago

This issue is related to #62