Closed pfistfl closed 1 year ago
In general, yes, we can do this. Would it be enough to have this hook after batch_eval? I would also like to keep this separate from terminators.
Really depends on which use-cases we want to cover. I guess we should collect a number of situations where we would want this
@mb706
> Really depends on which use-cases we want to cover.

Please outline the 1-3 most important ones here. You kinda started on that in the OP already.
@pfistfl https://github.com/mlr-org/bbotk/pull/110 This might cover some of your use-cases. Suggestions are welcome.
I answered in the PR.
Callbacks are implemented in bbotk and mlr3tuning now.
Creating this issue so we can discuss whether this makes sense.

pytorch, keras etc. make ample use of functional hooks/callbacks. In keras, for example, you can define a function that is run before/after every batch/epoch. I think this would fit nicely into mlr3tuning.

Possible use-cases:

- Early stopping, e.g. `nrounds* < nrounds` the model was trained with.
- Changing a `hyperparam` of the tuner when a certain criterion is met (for `infillCrit`):
  - `cb.lambda` after $n$ iterations.
  - `cb.lambda` when validation error stagnates.

This could also perform as a `Terminator`, as such a hook could basically just set some termination flag.
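To make the "hook as Terminator" idea concrete, here is a minimal sketch of the pattern in Python. All names (`TunerLoop`, `batch_eval`, `stop_on_stagnation`) are hypothetical and do not reflect the actual bbotk/mlr3tuning API; the point is only that a callback fired after each batch evaluation can set a termination flag, which makes a separate Terminator class unnecessary for this use-case.

```python
# Hypothetical sketch, NOT the real bbotk/mlr3tuning API: a tuning loop that
# fires user-supplied callbacks after every batch evaluation. A callback may
# flip a termination flag, so a hook can double as a Terminator.

class TunerLoop:
    def __init__(self, callbacks=None):
        self.callbacks = callbacks or []
        self.terminate = False   # flag a callback may set to stop the loop
        self.archive = []        # evaluated (iteration, score) pairs

    def batch_eval(self, iteration):
        # stand-in for a real batch evaluation; the score improves, then stagnates
        score = min(iteration, 5)
        self.archive.append((iteration, score))
        return score

    def run(self, max_iterations=100):
        for i in range(1, max_iterations + 1):
            self.batch_eval(i)
            for cb in self.callbacks:   # hook point "after batch_eval"
                cb(self)
            if self.terminate:
                break
        return len(self.archive)


def stop_on_stagnation(loop, patience=3):
    """Terminate when the best score has not improved for `patience` batches."""
    scores = [s for _, s in loop.archive]
    if len(scores) > patience and max(scores[-patience:]) <= max(scores[:-patience]):
        loop.terminate = True


loop = TunerLoop(callbacks=[stop_on_stagnation])
n_evals = loop.run()   # stops early once validation-style scores stagnate
```

The design choice matches the discussion above: termination logic lives in an ordinary callback rather than a dedicated class, and the loop only checks a flag after each batch.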