Closed: jakob-r closed this issue 3 years ago.
Second version is implemented:

```r
my_at = create_autotuner(learner_list = c("classif.ranger"))
```

You can add more than one learner, resulting in a (probably too complex?) `GraphLearner` in `my_at$learner`.
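For illustration, a call with two learners might look like the sketch below; the second learner key (`classif.xgboost`) is only an example on my side, not something prescribed in this thread:

```r
# Sketch based on the interface shown above; "classif.xgboost" is purely
# illustrative as a second learner.
library(mlr3automl)

my_at = create_autotuner(learner_list = c("classif.ranger", "classif.xgboost"))
my_at$learner   # the combined (possibly quite complex) GraphLearner
```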
The created learner currently comes with a number of preprocessing `PipeOp`s. I will adjust this in the following week when I refactor the preprocessing. A few questions about your use case:
We might have to discuss that in a bigger round. I think the first idea was to be close to the API of `mlr3tuning::AutoTuner`, with the only difference that a `ParamSet` is optional. I can see that it would also be practical to not have to worry about the `Tuner`. However, preprocessing should be optional. Do you internally always have to create a `GraphLearner` to combine subsampling with Hyperband?
For mlr3automl we always use a `GraphLearner`, because preprocessing is critical for the stability of the system. But I will rework the preprocessing next week anyway, so I will include a version without preprocessing for `export_autotuner`. I will ping you when it is ready.
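For reference, a minimal sketch of why combining subsampling with Hyperband usually goes through a `GraphLearner` in mlr3: the subsampling fraction acts as the budget parameter that Hyperband varies. This assumes `mlr3pipelines` and `mlr3hyperband`; the learner and parameter ranges are placeholders, not mlr3automl's actual defaults:

```r
library(mlr3)
library(mlr3learners)
library(mlr3pipelines)
library(mlr3tuning)
library(mlr3hyperband)
library(paradox)

# Wrap the learner in a graph so that a subsampling fraction exists as a
# tunable hyperparameter.
graph_learner = as_learner(po("subsample") %>>% lrn("classif.ranger"))

# Hyperband needs exactly one parameter tagged as "budget"; here it is the
# fraction contributed by PipeOpSubsample.
search_space = ps(
  subsample.frac      = p_dbl(0.1, 1, tags = "budget"),
  classif.ranger.mtry = p_int(1, 10)
)

at = AutoTuner$new(
  learner      = graph_learner,
  resampling   = rsmp("cv", folds = 3),
  measure      = msr("classif.ce"),
  search_space = search_space,
  terminator   = trm("none"),
  tuner        = tnr("hyperband", eta = 3)
)
```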
`export_autotuner` was refactored in #14. It now has the same interface as the `AutoTuner` class, but provides defaults for all parameters. Additionally, it works with Hyperband tuning by creating a `GraphLearner` with an additional `PipeOpSubsample`.
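Based purely on that description, a call could look roughly like the following; the argument names mirror `AutoTuner` as stated above, but the exact signature and defaults are my assumptions:

```r
# Hypothetical usage sketch of the refactored export_autotuner(); everything
# not passed explicitly is assumed to fall back to the mentioned defaults.
at = export_autotuner(
  learner    = lrn("classif.ranger"),
  resampling = rsmp("cv", folds = 3)
  # measure, search_space, terminator, tuner: defaults supplied internally
)
at$train(tsk("sonar"))   # behaves like a regular mlr3tuning::AutoTuner
```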
Either `AutoMLTuner` specializes `mlr3tuning::AutoTuner` and `AutoMLTuner$new(lrn("classif.svm"))` creates the desired `AutoTuner`, or `create_auto_tuner(lrn("classif.svm"))` creates an `mlr3tuning::AutoTuner` object.
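For comparison, the two proposed call styles side by side (neither the class nor the function is pinned down by this issue; both lines are illustrative):

```r
# (a) a subclass of mlr3tuning::AutoTuner that fills in tuning defaults
at1 = AutoMLTuner$new(lrn("classif.svm"))

# (b) a plain constructor function that assembles and returns an
#     mlr3tuning::AutoTuner with defaults
at2 = create_auto_tuner(lrn("classif.svm"))
```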