mlr-org / mlr3tuning

Hyperparameter optimization package of the mlr3 ecosystem
https://mlr3tuning.mlr-org.com/
GNU Lesser General Public License v3.0

evaluate_default breaks with graph learner #406

Open · opened 5 months ago by sebffischer

sebffischer commented 5 months ago

see https://stackoverflow.com/questions/77855462/error-when-performing-hpo-with-hyperband-and-mbo-with-helper-function-auto-tuner
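
A minimal sketch of the failing setup, reconstructed from the linked question (the task, the tuned ranges, and the exact placement of the evaluate_default flag are assumptions, not copied from the report):

library(mlr3)
library(mlr3learners)
library(mlr3pipelines)
library(mlr3tuning)
library(mlr3hyperband)

# xgboost wrapped in a GraphLearner, with nrounds as the hyperband budget
glrn = as_learner(po("learner", lrn("classif.xgboost",
  nrounds = to_tune(p_int(1, 16, tags = "budget")),
  eta = to_tune(1e-4, 1))))

at = auto_tuner(
  tuner = tnr("hyperband"),
  learner = glrn,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  evaluate_default = TRUE  # evaluating the defaults first is what breaks for the GraphLearner
)
at$train(tsk("sonar"))  # errors instead of tuning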

be-marc commented 5 months ago

Should be fixed by https://github.com/mlr-org/mlr3/pull/997

sebffischer commented 5 months ago

Hmm, now there is just a different error message, and it still does not make clear that this is a bug in mlr3 rather than a user error?

be-marc commented 5 months ago

So it's not really a bug. There is no default value for nrounds in the ParamSet of XGBoost, so default_values() can't find one either. I fixed this a few days ago in default_values.LearnerXGBoost, but it will not work for a GraphLearner.
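
For illustration, a quick check (assuming the classif.xgboost learner from mlr3learners) that the ParamSet carries no default for nrounds:

library(mlr3learners)
learner = lrn("classif.xgboost")
# nrounds is required but has no default registered in the ParamSet,
# so a default-value lookup comes back empty for it
"nrounds" %in% names(learner$param_set$default)
#> FALSE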

sebffischer commented 5 months ago

So what should happen now when the user specifies evaluate_default = TRUE and tunes xgboost in a GraphLearner or a normal learner?

be-marc commented 5 months ago

You will get the error message from mlr3 when you use a graph learner.

sebffischer commented 5 months ago

We offer a flag evaluate_default, but you get an error when you enable it? That is really not user-friendly, imho.

sebffischer commented 5 months ago

I think we should implement a default_values.GraphLearner method that looks something like this:


default_values.GraphLearner = function(x, ...) {
  # walk over all pipeops in the graph and collect the default values of
  # every wrapped learner, prefixed with the id of its PipeOpLearner
  # (test_class() is from checkmate, set_names() from mlr3misc)
  vals = list()
  for (obj in x$graph$pipeops) {
    if (test_class(obj, "PipeOpLearner")) {
      new_vals = default_values(obj$learner)
      new_vals = set_names(new_vals, paste0(obj$id, ".", names(new_vals)))
      vals = c(vals, new_vals)
    }
  }
  vals
}
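
Hypothetical usage of the sketch above (assuming default_values() works on the wrapped learner as written): the defaults come back prefixed with the pipeop id, matching the ids in the GraphLearner's param_set.

library(mlr3)
library(mlr3pipelines)

glrn = as_learner(po("learner", lrn("classif.rpart")))
default_values.GraphLearner(glrn)
# e.g. $classif.rpart.cp, $classif.rpart.xval, ... with names that
# line up with glrn$param_set$ids()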