Closed sebffischer closed 1 year ago
@mllg @pat-s @be-marc
> Currently our "default" is almost exclusively used for documenting the parameters (Marc said he programs on them somewhere)
Yes, we extract them with `get_defaults()` (https://github.com/mlr-org/paradox/blob/main/R/default_values.R) and add missing defaults with an S3 dispatch (https://github.com/mlr-org/mlr3learners/blob/a2dae0bfcdb0ef7ae8474dc6254437294946d7bc/R/LearnerClassifRanger.R#L153).
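For context, the extract-then-augment pattern described here can be sketched in plain R; the classes (`Learner`, `RangerLike`) and the toy learner object below are hypothetical stand-ins, not the actual mlr3learners code:

```r
# Generic: extract defaults; learner-specific methods add defaults that
# the upstream package documents but does not expose in its formals.
default_values <- function(x, ...) UseMethod("default_values")

default_values.Learner <- function(x, ...) {
  # generic extraction, e.g. from the wrapped function's formals
  as.list(formals(x$fun))
}

default_values.RangerLike <- function(x, ...) {
  defaults <- NextMethod()
  defaults$num.trees <- 500L  # upstream default that is absent from formals
  defaults
}

f <- function(mtry = NULL, num.trees) NULL  # num.trees has no formal default
learner <- structure(list(fun = f), class = c("RangerLike", "Learner"))
default_values(learner)$num.trees  # 500
```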
I think, for example, this issue could have been avoided if we distinguished "defaults" and "initial values" better.
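The distinction could be sketched in plain R (a toy stand-in, not the actual paradox API): an initial value is written into the stored values at construction, while a true default is only substituted when a value has not been set:

```r
# Toy parameter container illustrating defaults vs. initial values.
make_params <- function(defaults) {
  values <- list()  # initial values would be seeded here at construction
  list(
    set = function(name, v) values[[name]] <<- v,
    # a true default: substituted whenever the parameter is not set
    get_values = function() modifyList(defaults, values)
  )
}

ps <- make_params(defaults = list(num.trees = 500L))
ps$get_values()$num.trees  # 500, even though the value was never set
```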
I think we are not quite precise with our usage of `default`. In general it can mean two things: the value that the underlying implementation falls back to when a parameter is not set, or a value that we merely initialize a parameter with. I think we should be more precise here. If we claim to set "custom mlr3 defaults", then these parameters should not only be initialized but should always be returned by `$get_values()` when the parameter is not set. As it is right now, we should refer to them as "initial values", because that is what they are.

Also, there is another thing that is a little bit problematic:
Currently, our "default" is almost exclusively used for documenting the parameters (Marc said he programs on them somewhere). However, because some upstream packages have defaults like `sample.int(400)`, it is not straightforward to document them: setting `default = sample.int(400)` will record a random value as the default in the learner's parameter documentation. Therefore, the `default` field of a parameter should also allow for something like `quote(sample.int(400))`.
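For illustration, `quote()` keeps the expression unevaluated, so the documentation can show the expression itself rather than one randomly drawn value (plain base R):

```r
eager <- sample.int(400)        # evaluates immediately: a random integer
lazy  <- quote(sample.int(400)) # stays an unevaluated call

deparse(lazy)  # "sample.int(400)" -- suitable for the parameter docs
eval(lazy)     # draws a value only when one is actually needed
```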
I stumbled upon this when trying to implement a method `set_defaults()` that sets and stores the default values of a learner. This is currently done informally via `param_set$values = list(...)` in, e.g., the learner's `initialize()` function, which is why the defaults are not stored anywhere. A `set_defaults()` method would be useful so that a tuner could always evaluate an algorithm with its default parameters (in case those are chosen in a smart way) without having to reconstruct the learner from scratch.
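A hypothetical sketch of such a `set_defaults()` (plain R with a toy list-based learner; neither the function nor the fields exist in mlr3) might look like:

```r
# Record the chosen defaults explicitly instead of setting them
# informally via param_set$values = list(...).
set_defaults <- function(learner, defaults) {
  learner$stored_defaults <- defaults   # remembered for later
  learner$param_set_values <- defaults  # and applied as current values
  learner
}

reset_to_defaults <- function(learner) {
  # a tuner can restore the defaults without rebuilding the learner
  learner$param_set_values <- learner$stored_defaults
  learner
}

learner <- list(param_set_values = list())
learner <- set_defaults(learner, list(num.trees = 500L, mtry = 3L))
learner$param_set_values$mtry <- 10L  # a tuner overwrote a value
learner <- reset_to_defaults(learner)
learner$param_set_values$num.trees  # 500
```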