-
I tried the Hyperband tuner with my current workflow, and while all the other tuners work fine, the Hyperband tuner gives me the error:
Error in private$.optimize(inst) : abstract
tuner = TunerH…
-
If I pass k > 1 measures to the tuning instance, Hyperband automatically treats it as a multi-objective problem with k objectives.
We eventually want to optimize w.r.t. l < k measures onl…
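As a rough sketch of the behavior described above (the task, learner, and measures here are hypothetical stand-ins, and the `ti()` sugar assumes a recent mlr3tuning version), passing two measures produces a multi-criteria instance, so a tuner such as Hyperband sees k = 2 objectives:

```r
library(mlr3)
library(mlr3tuning)

# Hypothetical setup: two measures make this a multi-criteria instance,
# so the tuner optimizes k = 2 objectives.
instance = ti(
  task = tsk("sonar"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("holdout"),
  measures = msrs(c("classif.ce", "classif.fpr")),  # k = 2 measures
  terminator = trm("evals", n_evals = 20)
)
instance$objective$codomain  # two objectives, both minimized
```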
-
If Hyperband terminates early, then these lines:
https://github.com/mlr-org/mlr3hyperband/blob/256b3566e3e5dc2b1f735be3e8511e9a56300e97/R/TunerHyperband.R#L494-L497
are never reached and `bmr$rr_dat…
mb706 updated 4 years ago
-
```
library(bbotk)
source("tests/testthat/helper.R")  # provides MAKE_INST_2D_2D used below
terminator = term("evals", n_evals = 10)
inst = MAKE_INST_2D_2D(terminator)
opt = OptimizerRandomSearch$new()
opt$optimize(inst)
```
Th…
-
Dear mlr-org Team,
This is a rather weird bug: the code below works if one does not attach **mlr3spatiotempcv**, or if one runs it sequentially, i.e., setting `workers = 1` in the future pl…
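For reference, a minimal sketch of the sequential workaround mentioned above (using the future package's standard `plan()` API):

```r
library(future)

# Restricting the future plan to a single worker forces sequential
# execution, which sidesteps the parallel-only failure described above.
plan(multisession, workers = 1)
nbrOfWorkers()  # reports 1
```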
-
When https://github.com/mlr-org/paradox/issues/268 is fixed, there ought to be a dedicated function for this.
mb706 updated 4 years ago
-
Hi,
I might have misunderstood something, but I am running into issues with non-linear tuning grids. I am trying to define a grid with unequal spacing, e.g. the equivalent of `exp(seq(log(1), log(5…
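One way to get such an unequally spaced grid (a sketch assuming a recent paradox version; the parameter name `cost` and the range are made up for illustration) is to define the parameter on the log scale and exponentiate it in a `trafo`:

```r
library(paradox)

# Hypothetical parameter "cost" tuned over [1, 100]: the grid is
# equidistant on the log scale, so the transformed values are
# unequally spaced on the original scale.
search_space = ps(
  cost = p_dbl(lower = log(1), upper = log(100), trafo = function(x) exp(x))
)

design = generate_design_grid(search_space, resolution = 3)
sapply(design$transpose(), function(x) x$cost)  # 1, 10, 100
```

The grid generator only ever sees the log-scale values; `$transpose()` applies the trafo, so the learner receives the back-transformed values.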
-
We could remove `mlr_terminators` and `mlr_optimizers` to have a more basic package with no references to mlr3. The dictionaries are more useful in `mlr3tuning` as `mlr_terminators` and `mlr_tuners`.
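For context, a short sketch of what these dictionaries currently provide in bbotk (assuming a recent release, where `trm()` and `opt()` are the corresponding sugar functions):

```r
library(bbotk)

# The dictionaries map short keys to constructors:
terminator = mlr_terminators$get("evals")
optimizer = mlr_optimizers$get("random_search")

# Equivalent sugar functions:
terminator2 = trm("evals", n_evals = 10)
optimizer2 = opt("random_search")
```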
-
Hmm, this should be done in mlr3tuning, right?