huawei-noah / HEBO

Bayesian optimisation & Reinforcement Learning library developed by Huawei Noah's Ark Lab

Support for fixed parameters #29

Closed bbudescu closed 1 year ago

bbudescu commented 1 year ago

Is it possible to have HEBO search for the optimum over a design space in which a parameter is defined normally (e.g., as a real or an integer), but to temporarily (i.e., for a single optimization session) constrain that parameter to a single value and search only the subspace defined by the remaining unconstrained parameters?

I.e., something similar to Optuna's PartialFixedSampler.
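For reference, a minimal sketch of what this looks like in Optuna (the toy objective and the fixed value here are made up for illustration):

```python
import optuna

def objective(trial):
    x = trial.suggest_float('x', -1, 1)
    y = trial.suggest_float('y', -1, 1)
    return x ** 2 + y ** 2

# 'y' is pinned to 0.5; the base sampler only actually searches over 'x'
sampler = optuna.samplers.PartialFixedSampler({'y': 0.5}, optuna.samplers.TPESampler())
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)
```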

Why this is useful

Context

Generally, it is useful to be able to run several optimization sessions and use the results of trials from previous sessions to improve convergence in the current session. As far as I understand from the docs, with HEBO one can run a bunch of trials in one or more sessions and then feed the previous results to the Bayesian model via observe API calls before exploring further with suggest in the current session.
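A minimal sketch of that warm-start pattern, assuming the observe/suggest API shown in the HEBO README (the toy objective and the "persisted" results are placeholders):

```python
import numpy as np
import pandas as pd
from hebo.design_space.design_space import DesignSpace
from hebo.optimizers.hebo import HEBO

def obj(params: pd.DataFrame) -> np.ndarray:
    # toy objective: minimise (x - 0.37)^2
    return ((params[['x']].values - 0.37) ** 2).sum(axis=1, keepdims=True)

space = DesignSpace().parse([{'name': 'x', 'type': 'num', 'lb': 0.0, 'ub': 1.0}])
opt = HEBO(space)

# "session 1" results, loaded from wherever they were persisted
past_params = pd.DataFrame({'x': [0.1, 0.5, 0.9]})
past_y = obj(past_params)

# warm-start the current session by replaying past observations...
opt.observe(past_params, past_y)

# ...then continue exploring
rec = opt.suggest(n_suggestions=4)
opt.observe(rec, obj(rec))
```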

Scenario no. 1 - Accelerate optimization by reducing the search space

Now, as explained in the Optuna docs linked above and in the related issue, after running a bunch of trials one might settle on a particular value of a parameter and want to prevent the optimizer from investing more time in exploring values of that parameter that are clearly not going to yield better results, while still taking into account the results obtained with other values of that parameter (along the dimensions of the other unbound params, the trained cost predictor might still provide valuable insights).

Scenario no. 2 - Transfer Learning

If this were implemented, one could also do transfer learning, in a fashion similar to the one proposed, e.g., by SMAC3 (they call this feature 'Optimization across Instances') or OpenBox, i.e., reuse knowledge about high-yield sub-regions from one instance (dataset/task) on another.

Potential Solution

I'm thinking that one could simply change the bounds of a particular parameter every time a new optimization session is started. However, I'm not quite sure that the observe call will accept values that are out of bounds, nor, even if it doesn't crash, that the underlying Bayesian model is trained correctly.
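A sketch of that idea, assuming past observations are replayed into a fresh optimizer whose space has narrowed bounds (whether observe tolerates the out-of-bounds rows is exactly the open question; past_params/past_y are hypothetical placeholders):

```python
import pandas as pd
from hebo.design_space.design_space import DesignSpace
from hebo.optimizers.hebo import HEBO

# new session: same parameter, but with narrowed bounds
narrowed_space = DesignSpace().parse([{'name': 'x', 'type': 'num', 'lb': 0.3, 'ub': 0.4}])
opt = HEBO(narrowed_space)

# hypothetical results from an earlier, wider-bounded session;
# some rows (e.g. x = 0.9) fall outside the new bounds
past_params = pd.DataFrame({'x': [0.1, 0.35, 0.9]})
past_y = ((past_params.values - 0.37) ** 2).sum(axis=1, keepdims=True)

opt.observe(past_params, past_y)  # unclear whether this is safe/correct
rec = opt.suggest(n_suggestions=4)
```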

AntGro commented 1 year ago

Actually, the suggest method in HEBO accepts a dictionary fix_input as input (if fix_input[param_name] = v, the suggested points will have parameter param_name fixed to v). See [the HEBO code](https://github.com/huawei-noah/HEBO/blob/f050865fd2f554b5ca94642667257b365c753f29/HEBO/hebo/optimizers/hebo.py#L117).
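For example, a minimal sketch of a session where one parameter is pinned (the space and parameter names here are illustrative):

```python
from hebo.design_space.design_space import DesignSpace
from hebo.optimizers.hebo import HEBO

space = DesignSpace().parse([
    {'name': 'x0', 'type': 'num', 'lb': 0.0, 'ub': 1.0},
    {'name': 'x1', 'type': 'num', 'lb': 0.0, 'ub': 1.0},
])
opt = HEBO(space)

# every suggested point will have x1 == 0.3; only x0 is effectively searched
rec = opt.suggest(n_suggestions=4, fix_input={'x1': 0.3})
```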