Currently, the ensemble methods allow for different existing torch optimizers. It would be nice if users could pass a custom optimizer object instead of providing a string with the optimizer name.
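For illustration, the usage I have in mind looks roughly like this (the MLP below is just a stand-in base estimator, and the commented-out call shows the proposed, not yet existing, API):

```python
import torch.nn as nn
from torchensemble import VotingClassifier

class MLP(nn.Module):
    # A small stand-in base estimator.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))

    def forward(self, x):
        return self.net(x)

model = VotingClassifier(estimator=MLP, n_estimators=5, cuda=False)

# Current API: the optimizer is chosen by a string name.
model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

# Proposed usage (does not exist yet): pass a custom optimizer class directly.
# model.set_optimizer(MyCustomOptimizer, lr=1e-3)
```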
Hi @Arcify, to achieve this, is it enough to accept a custom optimizer object in the set_optimizer method?
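Roughly, the idea would be something like the sketch below; the helper is illustrative only and assumes the current implementation resolves string names via getattr on torch.optim, which may not match torchensemble's internals exactly:

```python
import torch

def set_optimizer(estimator, optimizer, **kwargs):
    # Resolve `optimizer` to an instance bound to `estimator`'s parameters.
    # Accepts either a string naming a torch.optim class (current behaviour)
    # or an optimizer class itself (the proposed extension).
    if isinstance(optimizer, str):
        optimizer_cls = getattr(torch.optim, optimizer)  # e.g. "Adam"
    else:
        optimizer_cls = optimizer  # assumed to follow torch.optim.Optimizer
    return optimizer_cls(estimator.parameters(), **kwargs)
```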
Hi @xuyxu, yes, that would be perfect and exactly what I need.
Since I am not quite familiar with custom optimizers, a PR would be much appreciated ;-)
If you are willing to take it on, feel free to comment below if you need any help.
I just tried to fix it, but I came to the conclusion that there is no way to do so that generalizes to other custom optimizers as well, given that not all optimizers have optimizer.zero_grad, among other small differences. I therefore think adding custom optimizers will not work out the way I hoped, so I will close this issue. Thanks for your time!
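To make the obstacle concrete, here is a sketch of the kind of optimizer that breaks a generic training loop; TwoStepOptimizer is hypothetical, loosely modeled on SAM-style optimizers that split their update into two phases:

```python
class TwoStepOptimizer:
    # A hypothetical custom optimizer that does not follow the standard
    # torch.optim.Optimizer interface: it has no zero_grad()/step(), and its
    # update is split into first_step()/second_step(), each requiring its
    # own forward/backward pass.
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr

    def first_step(self):
        ...  # perturb the parameters

    def second_step(self):
        ...  # apply the actual update

# A generic loop of the form
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
# raises AttributeError here, and there is no uniform way to drive such
# optimizers without optimizer-specific branches in the training code.
```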
Never mind, thanks for your great idea ;-)
Arcify closed this issue 2 years ago.