Open dpaetzel opened 10 months ago
I'm currently looking into this. I think one could (and should?) reuse `MLJBase.recursive_setproperty!`, but then the `Choice` (and other HP) constructors would still have to accept an `::Expr`, which they do not right now. Would you be interested in a PR?
I think it may be possible to solve this by simply replacing `Symbol` with `Union{Symbol, Expr}`? However, it would probably make sense to extract the hyperparameter label type (which is currently hardcoded to `Symbol`) into a definition such as `Label = Union{Symbol, Expr}`, which can then be used in all the `Dict`s etc.
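A minimal sketch of what that could look like (the `Choice` struct below is a hypothetical stand-in for illustration, not TreeParzen's actual definition):

```julia
# Hypothetical sketch: a label type alias that admits both Symbols and Exprs.
const Label = Union{Symbol, Expr}

# Stand-in for an HP constructor whose label field is currently
# hardcoded to Symbol in the real implementation.
struct Choice
    label::Label
    options::Vector{Int}
end

Choice(:max_depth, collect(1:10))           # works with a plain Symbol
Choice(:(model.max_depth), collect(1:10))   # would also work with Label
```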
However, I'm not vastly familiar with the intricacies of Tree-structured Parzen Estimation (and neither with this implementation) and am a little bit worried that this would break stuff. Maybe a person more familiar with the code can comment? :slightly_smiling_face:
Hi @dpaetzel, thank you for raising this issue. I've checked your examples, and the issue is not caused by `update_params!` failing to support nested hyperparams. Rather, as you correctly observed, the space of hyperparams to be optimised should be a dictionary whose keys (params) are `Symbol`s, not `Expr`s or Symbols built from those expressions. Given your example, the model param should be `:max_depth` rather than `model.max_depth` or `Symbol("model.max_depth")`.
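For illustration, a minimal sketch (not tied to TreeParzen's API) of the difference between these key forms:

```julia
# A key naming the hyperparameter directly is a plain Symbol:
good_key = :max_depth

# These forms are not valid keys for the tuning space:
bad_expr = :(model.max_depth)          # an Expr, not a Symbol
bad_sym  = Symbol("model.max_depth")   # a Symbol, but a dotted string,
                                       # not a field name

good_key isa Symbol    # true
bad_expr isa Symbol    # false
```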
Based on your MWE, you created a pipeline object containing both the standardizer and the model, and you passed that pipeline object to the `model` entry of the `TunedModel()` wrapper. This means that accessing the model params would indeed require `pipe.model.param`. If instead you provide `model=pipe.model`, then your space can be defined as below:
```julia
function usingexpr()
    space = Dict(
        :max_depth => HP.Choice(:max_depth, collect(1:10)),
    )
    pipe_tuned = TunedModel(
        model = pipe.model,
        tuning = MLJTreeParzenTuning(),
        ranges = space,
        measure = mae,
    )
    mach_tuned = machine(pipe_tuned, X, y)
    fit!(mach_tuned)
end
```
Would the above work for you?
If you're interested in a PR, then maybe parsing the expression and extracting the hyperparam name to be passed to TreeParzen could be an option, but this could be tricky if there were further changes to MLJ's dot syntax.
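As a rough sketch of that parsing idea (a hypothetical helper, assuming the label to extract is always the final field in an MLJ-style dotted-access expression):

```julia
# Hypothetical sketch: pull the final field name out of an expression
# such as :(model.max_depth) or :(pipe.model.max_depth).
extract_label(s::Symbol) = s
function extract_label(ex::Expr)
    ex.head === :. || error("expected dotted field access, got $(ex.head)")
    last = ex.args[end]  # the field is wrapped in a QuoteNode
    return last isa QuoteNode ? last.value : last
end

extract_label(:(model.max_depth))       # -> :max_depth
extract_label(:(pipe.model.max_depth))  # -> :max_depth
```

This would break if MLJ's dot syntax changed (e.g., to indexing or string paths), which is the fragility mentioned above.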
Describe the bug
I want to optimize the hyperparameters of a simple pipeline. According to the MLJ docs, this should be possible by accessing these hyperparameters using `:(dot.syntax.like.this)` (which works, e.g., for `Grid` tuning). However, this does not work with `MLJTreeParzenTuning`.
To Reproduce
An MWE is
Expected behavior
I expected to be able to tune nested hyperparameters just like when using the `Grid` strategy.
Environment (please complete the following information):
github:dpaetzel/nixpkgs/dpaetzel/nixos-config
Additional context
Thank you for making and maintaining this library! :slightly_smiling_face:
It looks like the problem is caused by the `update_param!` function in `MLJTreeParzen.jl` not respecting nested hyperparameters.