aditya-grover / climate-learn

Source code for ClimateLearn
MIT License

Support for arbitrary optimizer #53

Closed prakhar6sharma closed 1 year ago

prakhar6sharma commented 1 year ago

Is your feature request related to a problem? Please describe.
Not exactly a problem, but it would be nice to have support for optimizers other than just Adam and AdamW.

Describe the solution you'd like
Give the user the flexibility to choose their own optimizer, as long as it inherits from torch.optim.Optimizer.
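The requested behavior could be sketched as follows. This is a minimal illustration, not the actual ClimateLearn API: the `configure_optimizer` helper name and its signature are hypothetical. The idea is simply to accept any class that subclasses `torch.optim.Optimizer` and reject anything else with a clear error.

```python
import torch
import torch.nn as nn


def configure_optimizer(model, optimizer_cls=torch.optim.AdamW, **optimizer_kwargs):
    """Build an optimizer of the user's choice for the given model.

    `optimizer_cls` may be any class inheriting from torch.optim.Optimizer
    (Adam, SGD, RMSprop, a custom optimizer, ...); anything else is
    rejected up front with a TypeError.
    """
    if not (isinstance(optimizer_cls, type)
            and issubclass(optimizer_cls, torch.optim.Optimizer)):
        raise TypeError(
            f"optimizer_cls must subclass torch.optim.Optimizer, got {optimizer_cls!r}"
        )
    return optimizer_cls(model.parameters(), **optimizer_kwargs)


# Example: swap AdamW for SGD without touching the training code.
model = nn.Linear(4, 2)
opt = configure_optimizer(model, torch.optim.SGD, lr=0.01, momentum=0.9)
```

Validating the class eagerly, rather than failing later inside the training loop, gives the user an immediate and readable error if they pass something that is not an optimizer.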

prakhar6sharma commented 1 year ago

I am working on drafting a PR for this. I will ask for a review once it is done.