-
> Actually, it would probably be a good design to set the objective function upon creation of the OptimInstance. So objective_function(x) would call the private .objective_function(self, x). This priva…
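A minimal R6 sketch of that proposal, purely illustrative (the class, fields, and signatures are assumptions, not an existing API): the objective function is fixed when the instance is created, and the public `objective_function()` only delegates to the private `.objective_function()`.

```r
library(R6)

OptimInstance = R6Class("OptimInstance",
  public = list(
    initialize = function(objective_function) {
      # the objective function is fixed upon creation of the instance
      private$.objective_function = objective_function
    },
    objective_function = function(x) {
      # public wrapper only delegates to the private implementation
      private$.objective_function(self, x)
    }
  ),
  private = list(
    .objective_function = NULL
  )
)

# the stored function receives the instance and the point to evaluate
inst = OptimInstance$new(function(inst, x) sum(x^2))
inst$objective_function(c(1, 2))
#> [1] 5
```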
-
The error occurs in the last line when executing the following:
```{r 03-optimization-hyperband-004}
library(mlr3hyperband)
library(mlr3pipelines)
library(mlr3tuning)
library(mlr3)
library(paradox)
set.…
```
-
Hi,
I stumbled on this by chance; I am not sure whether it can be classified as a bug, but I thought you should know about it.
When combined with a learner that fails occasionally, the stagnation terminator retu…
-
Not sure how to handle this:
the package should allow for the following:
if multiple points are evaluated, this should be parallelized (by future) and encapsulated (by callr).
Now it seems reaso…
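A minimal sketch of how this can already be configured from the user side, assuming mlr3's `$encapsulate` field and a future plan (the `classif.rpart` learner is just a placeholder):

```r
library(mlr3)
library(future)

# evaluate multiple points in parallel via future
future::plan("multisession", workers = 2)

# run train/predict of each evaluation in a separate R session via callr
learner = lrn("classif.rpart")
learner$encapsulate = c(train = "callr", predict = "callr")
```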
-
It is really not good for debugging (and similar workflows) to do this internally;
the logger can be auto-configured from the outside to achieve the same effect.
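For example, with the lgr logger used across the mlr3 ecosystem, the same effect is reached from user code (assuming the package-level logger is named "mlr3"):

```r
library(lgr)

# lower the verbosity from the outside instead of hard-coding it internally
lgr::get_logger("mlr3")$set_threshold("warn")

# or additionally write all messages to a file
lgr::get_logger("mlr3")$add_appender(lgr::AppenderFile$new("optim.log"))
```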
-
xss_trafoed should be added as a list of lists to the `Archive` in `add_evals` even when just one parameter is optimized.
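To illustrate the expected structure (values are hypothetical): even with a single parameter, each evaluated point should be stored as its own list inside `xss_trafoed`.

```r
# two evaluated points of a single parameter `cp`, stored as a list of lists
xss_trafoed = list(
  list(cp = 0.01),
  list(cp = 0.10)
)
```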
-
In the evaluator? Also make sure they don't happen twice if we connect this to mlr3tuning!
-
- tuning instance
- MBO target functions vs. `TuningInstanceMath`
- split archive and target function (see the sketch after this list)
- terminators
Ergo:
- mlr3optools package?
- parallel versions?
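One possible shape of that split, purely as a sketch (the class names, fields, and the `add_evals()` signature below are assumptions, not an existing API): the archive only stores evaluations, while a separate objective object wraps the target function.

```r
library(R6)
library(data.table)

Archive = R6Class("Archive",
  public = list(
    data = NULL,
    initialize = function() {
      self$data = data.table()
    },
    # append a batch of points (xdt) and their outcomes (ydt)
    add_evals = function(xdt, ydt) {
      self$data = rbindlist(list(self$data, cbind(xdt, ydt)), fill = TRUE)
    }
  )
)

Objective = R6Class("Objective",
  public = list(
    fun = NULL,
    initialize = function(fun) {
      self$fun = fun
    },
    eval = function(xs) {
      self$fun(xs)
    }
  )
)

# usage: evaluate a point via the objective, then log it in the archive
obj = Objective$new(function(xs) list(y = xs$x^2))
arch = Archive$new()
arch$add_evals(data.table(x = 2), as.data.table(obj$eval(list(x = 2))))
arch$data
```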
-
@mb706 already has something here.
https://github.com/mb706/okmbo