clearhanhui opened 1 year ago
Thanks. Would you like to create a PR?
One way I worked around this bug is to ensure that `space`/`param_space` only contains hyperparameters defined with a tune search space, and to remove any constants. I am using Ray Train `TorchTrainer`, so I moved the constants there instead.

If you are using the function trainable API, consider splitting the constants out of `config` into separate arguments and using `tune.with_parameters()`, as sketched below.

I believe this bug happens when trying to merge the constants with the sampled hyperparameters in `config`.
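A minimal sketch of the `tune.with_parameters()` approach; the trainable `train_fn` and the constants `num_boost_round`/`data_path` are hypothetical, not from this issue:

```python
from ray import tune

# Hypothetical trainable: config carries only the sampled hyperparameters,
# while the constants arrive as ordinary keyword arguments.
def train_fn(config, num_boost_round=None, data_path=None):
    ...

# Only true search-space entries live in param_space.
param_space = {"learning_rate": tune.loguniform(1e-3, 1e-1)}

# tune.with_parameters() binds the constants to the trainable,
# so they never have to appear in param_space at all.
trainable = tune.with_parameters(train_fn, num_boost_round=100, data_path="train.csv")
tuner = tune.Tuner(trainable, param_space=param_space)
results = tuner.fit()
```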
Thank you @yxtay.
Bug details
I am running HPO for XGBoost with Ray and BlendSearch. At flaml/tune/searcher/search_thread.py#L66 the code calls `config.update(self._const)`. In my case both `config` and `self._const` contain a nested `params` dict, so the shallow update replaces the entire sampled `params` entry with the constant one, and after the update step the values in `config['params']` sampled from the search space are all dropped.
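A minimal sketch of the failure mode; the dict contents are illustrative stand-ins, not the exact values from the run:

```python
config = {"params": {"learning_rate": 0.05, "max_depth": 7}}   # sampled by the searcher
self_const = {"params": {"objective": "binary:logistic"}}      # constants in self._const

# dict.update is shallow: the constant "params" dict replaces the
# sampled one wholesale instead of being merged into it.
config.update(self_const)
print(config)
# {'params': {'objective': 'binary:logistic'}}
```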
How to solve
I solved it by recursively updating `config`. Then just replace `config.update(self._const)` with `recursive_update(config, self._const)`; an example helper is sketched below.
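A minimal sketch of such a `recursive_update` helper, assuming plain nested dicts:

```python
def recursive_update(target: dict, updates: dict) -> dict:
    """Merge ``updates`` into ``target`` in place, recursing into
    values that are dicts on both sides instead of replacing them."""
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(target.get(key), dict):
            recursive_update(target[key], value)
        else:
            target[key] = value
    return target
```

The only difference from a plain `dict.update` is the recursive branch taken when a key maps to a dict on both sides, which is exactly the `params` case above.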
With that change I get the expected result: the constants are merged in and the sampled values in `config['params']` are preserved.

My Traceback