a-hanf / mlr3automl

Automated machine learning in mlr3
GNU Lesser General Public License v3.0

AutoMLTuner #9

Closed: jakob-r closed this issue 3 years ago

jakob-r commented 3 years ago

Either AutoMLTuner specializes mlr3tuning::AutoTuner, so that AutoMLTuner$new(lrn("classif.svm")) creates the desired AutoTuner, or a function create_auto_tuner(lrn("classif.svm")) creates an mlr3tuning::AutoTuner object.
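
Side by side, the two proposed call styles (neither exists at this point; this is only a sketch of the proposal):

# Option 1: AutoMLTuner as a subclass of mlr3tuning::AutoTuner
at = AutoMLTuner$new(lrn("classif.svm"))
# Option 2: a constructor function returning an mlr3tuning::AutoTuner
at = create_auto_tuner(lrn("classif.svm"))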

a-hanf commented 3 years ago

The second version is implemented:

my_at = create_autotuner(learner_list = c("classif.ranger"))

You can add more than one learner, which results in a (probably too complex?) GraphLearner in my_at$learner.
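
For example (the second learner key here is only an illustration; which learners are supported is an assumption on my side):

my_at = create_autotuner(learner_list = c("classif.ranger", "classif.svm"))
# with more than one learner, my_at$learner is a GraphLearner combining them
my_at$learner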

The created learner currently comes with a number of preprocessing PipeOps. I will adjust this next week when I refactor the preprocessing. A few questions for your use case:

  1. Is Hyperband suitable as the default tuner?
  2. Is it ever useful for you to have preprocessing in this GraphLearner, or would you always want the learner and nothing else?

jakob-r commented 3 years ago

We might have to discuss that in a bigger round. I think the first idea was to stay close to the API of mlr3tuning::AutoTuner, with the only difference being that the ParamSet is optional. I can see that it would also be practical not to have to worry about the Tuner. However, preprocessing should be optional. Do you internally always have to create a GraphLearner to combine subsampling with Hyperband?
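
For context, a minimal sketch of how subsampling and Hyperband are typically combined in the mlr3 ecosystem (using mlr3pipelines and mlr3hyperband; this is an illustration, not necessarily what mlr3automl does internally): Hyperband needs a budget parameter, and if the learner itself offers none, the frac parameter of po("subsample") can serve as the budget, which in turn requires wrapping the learner in a GraphLearner.

library(mlr3)
library(mlr3pipelines)
library(paradox)
# wrap the learner so that subsampling becomes part of the model
glrn = as_learner(po("subsample") %>>% lrn("classif.ranger"))
# the subsampling fraction is tagged as the budget for tnr("hyperband")
search_space = ps(
  subsample.frac      = p_dbl(lower = 0.1, upper = 1, tags = "budget"),
  classif.ranger.mtry = p_int(lower = 1, upper = 10)  # illustrative range
)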

a-hanf commented 3 years ago

For mlr3automl we always use a GraphLearner, because preprocessing is critical for the stability of the system. But I will rework the preprocessing next week anyway, so I will include a version without preprocessing for export_autotuner.

I will ping you when it is ready.

a-hanf commented 3 years ago

export_autotuner was refactored in #14.

It now has the same interface as the AutoTuner class, but provides defaults for all parameters. Additionally, it works with Hyperband tuning by creating a GraphLearner with an additional PipeOpSubsample.
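
A hypothetical usage sketch (the argument names mirror mlr3tuning::AutoTuner's constructor as described above; the exact signature and defaults are assumptions, not the documented interface):

library(mlr3)
# all tuning settings fall back to defaults
my_at = export_autotuner(lrn("classif.ranger"))
# or override individual AutoTuner-style arguments
my_at = export_autotuner(
  lrn("classif.ranger"),
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce")
)
my_at$train(tsk("sonar"))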