TorchEnsemble-Community / Ensemble-Pytorch

A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
https://ensemble-pytorch.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Add custom optimizers #113

Closed · Arcify closed this issue 2 years ago

Arcify commented 2 years ago

Currently, the ensemble methods support the existing torch optimizers, selected by passing a string with the optimizer name. It would be nice if users could pass a custom optimizer object instead.
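
For context, the current call style looks roughly like the sketch below. The `MLP` class is just a placeholder estimator for illustration, and the commented-out last line shows the requested usage with a hypothetical `MyCustomOptimizer`:

```python
import torch.nn as nn
from torchensemble import VotingClassifier

# Placeholder estimator for illustration.
class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(784, 10)

    def forward(self, x):
        return self.fc(x.flatten(1))

model = VotingClassifier(estimator=MLP, n_estimators=5)

# Current API: the optimizer is selected by its name in torch.optim.
model.set_optimizer("Adam", lr=1e-3)

# Requested API (hypothetical): pass a custom optimizer directly.
# model.set_optimizer(MyCustomOptimizer, lr=1e-3)
```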

xuyxu commented 2 years ago

Hi @Arcify, to achieve this, is it enough to accept a custom optimizer object in the set_optimizer method?
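
One possible way to support both call styles is sketched below. This is only an illustration of the idea, not the library's actual internals; the helper name `make_optimizer` is made up for the example:

```python
import torch

def make_optimizer(optimizer, params, **optimizer_args):
    # Hypothetical helper: resolve either a name string or an
    # Optimizer subclass into a constructed optimizer instance.
    if isinstance(optimizer, str):
        # Existing behavior: look up a built-in optimizer by name.
        optimizer_cls = getattr(torch.optim, optimizer)  # e.g. "Adam" -> torch.optim.Adam
    elif isinstance(optimizer, type) and issubclass(optimizer, torch.optim.Optimizer):
        # New behavior: accept a custom optimizer class directly.
        optimizer_cls = optimizer
    else:
        raise TypeError(
            "optimizer must be a name string or a torch.optim.Optimizer subclass"
        )
    return optimizer_cls(params, **optimizer_args)

# Usage: both calls produce a torch.optim.Adam instance.
layer = torch.nn.Linear(4, 2)
opt_by_name = make_optimizer("Adam", layer.parameters(), lr=1e-3)
opt_by_class = make_optimizer(torch.optim.Adam, layer.parameters(), lr=1e-3)
```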

Arcify commented 2 years ago

Hi @xuyxu , yes, that would be perfect and exactly what I need.

xuyxu commented 2 years ago

Since I am not quite familiar with custom optimizers, a PR would be much appreciated ;-)

If you are willing to work on this, feel free to comment below if you need any help.

Arcify commented 2 years ago

I just tried to fix it, but I came to the conclusion that there is no fix that generalizes to arbitrary custom optimizers: not all optimizers expose optimizer.zero_grad, and there are a few other small incompatibilities. I therefore think adding custom optimizers will not work out the way I hoped, so I will close this issue. Thanks for your time!
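
To make the incompatibility concrete: a typical ensemble training loop (a simplified sketch below, not the library's exact code) assumes every optimizer exposes `zero_grad()` and a closure-free `step()`. An optimizer such as `torch.optim.LBFGS`, whose `step()` expects a closure, already breaks that contract:

```python
import torch

def train_one_batch(model, optimizer, criterion, data, target):
    # The loop assumes the standard torch.optim.Optimizer interface:
    optimizer.zero_grad()  # must exist on the optimizer
    loss = criterion(model(data), target)
    loss.backward()
    optimizer.step()       # must work without a closure
    return loss.item()

# torch.optim.LBFGS deviates from this contract: its step() expects a
# closure that re-evaluates the model and returns the loss, so a loop
# written like the one above cannot drive it correctly.
```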

xuyxu commented 2 years ago

No problem, thanks for your great idea ;-)