mlr-org / mlr3tuning

Hyperparameter optimization package of the mlr3 ecosystem
https://mlr3tuning.mlr-org.com/
GNU Lesser General Public License v3.0

Add nloptr #234

Closed pfistfl closed 4 years ago

pfistfl commented 4 years ago

As I want to use it for threshold tuning, I added nloptr. It basically allows for nonlinear optimization with equality/inequality constraints. Currently we can only use part of nloptr, as we usually have no gradient information available; to extend to gradient-based algorithms, we should decide what that would look like.
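A minimal sketch of what the gradient-free, constrained use case looks like with nloptr (hypothetical objective, not code from this PR; COBYLA is one of the derivative-free algorithms that accepts inequality constraints):

```r
library(nloptr)

# Hypothetical smooth objective over two thresholds in [0, 1]
eval_f <- function(x) (x[1] - 0.3)^2 + (x[2] - 0.7)^2

# Inequality constraint g(x) <= 0: the thresholds may sum to at most 1
eval_g_ineq <- function(x) x[1] + x[2] - 1

res <- nloptr(
  x0 = c(0.5, 0.5),           # starting point
  eval_f = eval_f,
  lb = c(0, 0), ub = c(1, 1), # box constraints
  eval_g_ineq = eval_g_ineq,
  opts = list(algorithm = "NLOPT_LN_COBYLA", xtol_rel = 1e-8, maxeval = 1000)
)
res$solution
```

No gradient is supplied anywhere, which is exactly the restriction discussed above: only the `NLOPT_LN_*` / `NLOPT_GN_*` derivative-free algorithms are usable when the objective is a black-box tuning measure.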

Currently the "TunerNloptr with int params and trafo" unit tests fail, but this does not seem to be caused by my changes.

mllg commented 4 years ago

Out of curiosity: is nloptr really so much better suited for threshold tuning than optim()?

pfistfl commented 4 years ago

I do not know whether this holds for threshold tuning; empirically it at least seems to hold for ensemble weights (page 47).

pfistfl commented 4 years ago

What is the status here? I think you wanted to look at this, @be-marc @jakob-r, or is any action required from my side?

jakob-r commented 4 years ago

I guess either @be-marc or I have to adapt that to all the changes after #248 is merged.

be-marc commented 4 years ago

@pfistfl Sorry, I missed that you had already created a PR. I used the code you posted here for the implementation. The only difference is that I stuck closer to the package defaults: x0 is not created from a random design, and algorithm is also a required parameter. You can find the code in bbotk::OptimizerNLoptr and mlr3tuning::TunerNLoptr. If you want to improve something, please open a new issue or PR.
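A sketch of how the merged tuner can be invoked, assuming it is registered under the key `"nloptr"` and takes `algorithm` as a required parameter as described above (the learner, task, and budget here are illustrative only; only numeric search spaces are supported by a derivative-free numeric optimizer):

```r
library(mlr3)
library(mlr3tuning)

# Tune a single numeric hyperparameter with the NLopt-backed tuner.
instance = TuningInstanceSingleCrit$new(
  task = tsk("iris"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 0.1)),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  terminator = trm("evals", n_evals = 20)
)

# algorithm must be given explicitly, matching the package defaults
tuner = tnr("nloptr", algorithm = "NLOPT_LN_BOBYQA")
tuner$optimize(instance)
instance$result_learner_param_vals
```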