-
Hi,
I played around with the vignette
```{r}
library("mlr3tuning")
library("mlr3keras")  # presumably needed for lrn("regr.keras") and cb_es()
learner = lrn("regr.keras", callbacks = list(cb_es(patience = 3)))
task = mlr_tasks$get("mtcars")
resampling = rsmp("holdo…
```
-
-
[nds_selection.R](https://github.com/mlr-org/mlr3hyperband/blob/master/R/nds_selection.R) returns the best subset of points by non-dominated sorting with hypervolume contribution for tie-breaking.
…
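The tie-breaking step can be illustrated in two dimensions: on a minimization front, a point's exclusive hypervolume contribution is the area it alone dominates relative to a reference point, and points with larger contributions are preferred when a front must be cut. A base-R sketch, not the `nds_selection.R` implementation (`hv_contrib_2d` is a made-up name; `ref` must be dominated by every point of the front):

```r
# Exclusive hypervolume contributions of a 2-D Pareto front (minimization).
# After sorting by the first objective, the front's second objective is
# decreasing, so each point's exclusive area is a rectangle bounded by
# its neighbours (and by `ref` at the two ends).
hv_contrib_2d = function(front, ref) {
  o = order(front[, 1])             # sort by first objective, ascending
  x = front[o, 1]; y = front[o, 2]  # y is then descending along the front
  n = length(x)
  contrib = (c(x[-1], ref[1]) - x) * (c(ref[2], y[-n]) - y)
  contrib[order(o)]                 # return in the input row order
}

front = rbind(c(1, 4), c(2, 2), c(4, 1))
hv_contrib_2d(front, ref = c(5, 5))  # 1 4 1: the middle point contributes most
```

Dropping the point with the smallest contribution (and recomputing) gives a simple greedy rule for shrinking a front to the requested subset size.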
-
I can't get the rather basic tuning example from the mlr3book to run: https://mlr3book.mlr-org.com/tuning.html
The following output and error appear:
```r
INFO [17:43:03.125] Starting to optimize …
```
-
### Expected Behaviour
With the base learner set to "bbs", "surv.mboost" should be able to automatically handle factor-type variables and model the corresponding non-linear relations…
-
We want each tuner to freely decide the resampling experiment itself, but the function has become rather complex, including error handling.
I think it should have something like a subset of hashes t…
-
-
I tried the Hyperband tuner with my current workflow, and while all the other tuners work fine, the Hyperband tuner gives me this error:
Error in private$.optimize(inst) : abstract
tuner = TunerH…
-
If I pass k > 1 measures to the tuning instance, it gets automatically treated as a multi-objective problem by hyperband with k objectives.
We eventually want to optimize w.r.t. l < k measures onl…
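Under one reading, optimizing w.r.t. l < k of the k recorded measures amounts to applying Pareto dominance only to the chosen l columns of the archive. A base-R sketch of that idea (`pareto_subset` is a hypothetical helper, not mlr3hyperband API; minimization is assumed):

```r
# Keep only archive rows that are non-dominated w.r.t. a chosen subset
# of objective columns; remaining columns are carried along untouched.
pareto_subset = function(archive, cols) {
  Y = as.matrix(archive[, cols, drop = FALSE])
  keep = vapply(seq_len(nrow(Y)), function(i) {
    others = Y[-i, , drop = FALSE]
    # row i is kept unless some other row is <= everywhere and < somewhere
    !any(apply(others, 1, function(y) all(y <= Y[i, ]) && any(y < Y[i, ])))
  }, logical(1))
  archive[keep, , drop = FALSE]
}

archive = data.frame(
  ce   = c(0.20, 0.30, 0.25),
  fnr  = c(0.50, 0.10, 0.55),
  time = c(9, 1, 3)  # measured but not optimized here
)
pareto_subset(archive, c("ce", "fnr"))  # keeps rows 1 and 2; row 3 is dominated
```

Here `time` is recorded but ignored by the dominance check, mirroring the requested split between measured and optimized objectives.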
-
If hyperband terminates early, then these lines:
https://github.com/mlr-org/mlr3hyperband/blob/256b3566e3e5dc2b1f735be3e8511e9a56300e97/R/TunerHyperband.R#L494-L497
are never reached and `bmr$rr_dat…
mb706 updated 4 years ago