cog-imperial / entmoot

Multiobjective black-box optimization using gradient-boosted trees
https://entmoot.readthedocs.io/
BSD 3-Clause "New" or "Revised" License

lightgbm model hyperparameter tuning #19

Closed: R-M-Lee closed this issue 11 months ago

R-M-Lee commented 1 year ago

@spiralulam What do you think about giving Enting.fit a kwargs parameter to allow the user to set the lightgbm hyperparameters? I want to do something like cross-validation to get a better surrogate rather than accepting the defaults.

Then we can either leave the choice of hyperparameters to the user or provide a simple function to optimize the hyperparams (I would propose random sampling in this case to keep things relatively simple).
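
A minimal sketch of the random-search idea, using LightGBM's scikit-learn wrapper with RandomizedSearchCV on synthetic data (the Enting.fit kwargs pass-through is only a proposal at this point, and the search space below is illustrative):

```python
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in data; in practice this would be the surrogate's training set.
rng = np.random.default_rng(42)
X = rng.random((200, 4))
y = rng.random(200)

# Illustrative search space; the candidate ranges are a design choice.
param_distributions = {
    "num_leaves": [15, 31, 63],
    "learning_rate": [0.01, 0.05, 0.1],
    "min_child_samples": [5, 10, 20],
}

search = RandomizedSearchCV(
    LGBMRegressor(),
    param_distributions,
    n_iter=10,
    cv=5,
    random_state=42,
)
search.fit(X, y)
print(search.best_params_)  # candidates for the surrogate's train_params
```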

R-M-Lee commented 1 year ago

Update: I noticed that Enting takes a params dict, and the hyperparameters are specified here:

params["tree_train_params"]["train_params"]

More generally, have we documented what can be in the params dict? I see unc_params and tree_train_params from going through the code, but I'm not sure I would find this in the docs.
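
To make the nesting concrete, a hedged sketch of how such a params dict might be assembled and passed to Enting (the problem setup and the unc_params contents below are assumptions for illustration; the train_params entries are ordinary LightGBM options that, per this thread, sit under ["tree_train_params"]["train_params"]):

```python
from entmoot import Enting, ProblemConfig

# Minimal problem setup, assumed here for illustration.
problem_config = ProblemConfig(rnd_seed=73)
problem_config.add_feature("real", (0.0, 1.0))
problem_config.add_feature("real", (0.0, 1.0))
problem_config.add_min_objective()

params = {
    "unc_params": {"dist_metric": "l1"},  # uncertainty-model settings (assumed key/value)
    "tree_train_params": {
        "train_params": {  # forwarded to LightGBM training
            "num_leaves": 31,
            "learning_rate": 0.05,
            "min_data_in_leaf": 20,
        },
    },
}
enting = Enting(problem_config, params=params)
```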

spiralulam commented 1 year ago

@R-M-Lee Yes, we need to document the params dict. Will do.

spiralulam commented 11 months ago

You can now check all possible parameters using the new class structure implemented by Toby.
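
For anyone landing here later, one hedged way to list the available fields, assuming the new structure exposes the parameters as dataclasses (the module path and class name below are assumptions, not confirmed in this thread):

```python
import dataclasses

# ASSUMPTION: module path and class name are illustrative only; the thread
# just says the parameters now live in a new class structure.
from entmoot.models.model_params import EntingParams

for field in dataclasses.fields(EntingParams):
    print(field.name, field.type)
```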