Open sebffischer opened 5 months ago
Should be fixed by https://github.com/mlr-org/mlr3/pull/997
Hmm, now there is just a different error message that still does not make it clear that this is not a user error but a bug in mlr3?
So it's not really a bug. There is no default value for nrounds in the ParamSet of XGBoost, so `default_values()` can't find one either. I fixed this a few days ago in `default_values.LearnerXGBoost`, but it will not work for GraphLearner.
So what should happen now when the user specifies `evaluate_default = TRUE` and tunes xgboost in a graph learner / normal learner?
You will get the error message from mlr3 when you use a graph learner.
We offer a flag `evaluate_default`, but you get an error when you enable it? That is really not user-friendly imho.
I think we should implement a `default_values.GraphLearner` method that looks something like:
```r
default_values.GraphLearner = function(x) {
  vals = list()
  # iterate over the pipeops of the wrapped graph
  # (a GraphLearner stores its Graph in $graph, whose $pipeops is a named list)
  for (obj in x$graph$pipeops) {
    # test_class() is from checkmate, set_names() from mlr3misc
    if (test_class(obj, "PipeOpLearner")) {
      new_vals = default_values(obj$learner)
      # prefix each value with the pipeop id, matching the graph's param names
      new_vals = set_names(new_vals, paste0(obj$id, ".", names(new_vals)))
      vals = c(vals, new_vals)
    }
  }
  vals
}
```
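A hypothetical usage sketch, assuming such a method were registered (the pipeline below is only an illustration, not taken from the issue):

```r
library(mlr3)
library(mlr3pipelines)

# wrap xgboost in a small preprocessing pipeline
glrn = as_learner(po("scale") %>>% lrn("classif.xgboost"))

# with a default_values.GraphLearner method, this would return the wrapped
# learner's defaults, prefixed by the pipeop id,
# e.g. "classif.xgboost.nrounds" instead of "nrounds"
default_values(glrn)
```

The id prefix matters because a GraphLearner's ParamSet namespaces every hyperparameter by pipeop id, so unprefixed names from the inner learner would not match.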
see https://stackoverflow.com/questions/77855462/error-when-performing-hpo-with-hyperband-and-mbo-with-helper-function-auto-tuner