**Open** · iiternalfire opened this issue 4 years ago
Thanks for the minimal code and stack trace.
I suspect this will not be satisfying, but this seems like a feature request motivated by a mis-specification of the optimization problem. In other words, you can indeed change the constraints dynamically if you insist, e.g. by updating `bo.context` (a `ContextManager`; see `class ContextManager` in `GPyOpt/optimization/acquisition_optimizer.py`) - but that's an anti-pattern.
To move the issue forward, concretely, would you mind sharing the motivating problem for using `fun(x)-6` in the domain?
Hi @ekalosak, sorry for the delay in my response.
The concrete objective is to implement Bayesian optimization with unknown constraints [1], where the analytical structure of the constraint is not known to us, but it can be evaluated point-wise by the function `fun(x)` in my example above. For example, if I am trying to do hyper-parameter optimization to get the best accuracy under the constraint that inference time on each instance is less than 60 ms, then `fun` uses timers to evaluate a 95% upper confidence bound on inference time, and the constraint is `fun(x) - 60 <= 0`. See [1] for other examples.
[1] Gelbart, M.A., Snoek, J. and Adams, R.P., 2014. Bayesian optimization with unknown constraints. arXiv preprint arXiv:1403.5607.
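To make the point-wise setting concrete, here is a minimal sketch (the analytic `fun`, bounds, and the random-search loop below are illustrative stand-ins for my real timing code, not GPyOpt API):

```python
import random

def fun(x):
    """Stand-in for a black-box measurement, e.g. a 95% upper
    confidence bound on per-instance inference time in ms; the
    analytic form here is only for illustration."""
    return 40.0 + 30.0 * sum(xi * xi for xi in x)

def constraint(x, budget_ms=60.0):
    # Feasible iff fun(x) - budget_ms <= 0, evaluated point-wise;
    # no analytical structure of fun is assumed.
    return fun(x) - budget_ms <= 0.0

def objective(x):
    # Toy objective standing in for (negative) validation accuracy.
    return sum((xi - 0.2) ** 2 for xi in x)

# Naive constrained random search, just to show where a point-wise
# black-box constraint slots into an optimization loop.
rng = random.Random(0)
best_x, best_y = None, float("inf")
for _ in range(200):
    x = [rng.uniform(-1.0, 1.0) for _ in range(2)]
    if not constraint(x):      # reject infeasible candidates
        continue
    y = objective(x)
    if y < best_y:
        best_x, best_y = x, y
```

The real feature request is to let GPyOpt accept such a `constraint` callable directly, rather than only a string expression.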
**Meta** This is clearly a valuable addition to GPyOpt - thanks for bringing it up. It's a substantial feature addition. If you put up a draft branch, I'd bet some other contributors would help with some parts. Linus's Law in action >:)
**Implementation** It's one thing to explicitly dis-allow portions of the design space. It's another to model the constraints as a conjugate Gaussian process in the acquisition function (Section 3.1, Eqn. (7) in Gelbart et al. 2014) with arbitrary function calls. In this case, implementation-wise, it appears we'd need:

- an extended evaluation call, e.g. `self.Y_new, cost_new, constraints_violated = self.objective.evaluate(self.suggested_sample)` (note the `constraints_violated` addition to the return signature) - or a list/dict of `g_k(x)` constraint functions should they be decoupled (Section 2.2);
- changes to the `run_optimization` loop to accommodate the previous additions, e.g. how and when to evaluate `g_k(x)`, and when to optimize the constraint surrogate model (`g_k`) hyperparameters;
- I'm sure there are other things I'm leaving out.

Do those requirements look right? As for an MVP (minimally viable product), the feature branch needs:
- an optional kwarg in `core/bo.py`: `external_constraints: Dict[str, Callable[List[...], Union[bool, float]]] = None`;
- `external_constraints` exposed in the `ModularBayesianOptimization` class for top-level user-facing access;
- a check in the `run_optimization` function (i.e. do an `if self.external_constraints is not None: self.check_external_constraints(...)`).

**Notes**
Following the evaluation stack down, it looks like a lot of this can go through the `acquisition_optimizer`'s `optimize()` (here).
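To make the MVP concrete, a rough sketch of how the kwarg and the check could fit together (the `BO` class and `run_optimization_step` below are illustrative stand-ins, not actual GPyOpt code; only the `external_constraints` / `check_external_constraints` names come from the proposal above):

```python
from typing import Callable, Dict, List, Optional

class BO:
    """Illustrative stand-in for GPyOpt's core/bo.py BO class."""

    def __init__(
        self,
        external_constraints: Optional[
            Dict[str, Callable[[List[float]], bool]]
        ] = None,
    ):
        self.external_constraints = external_constraints

    def check_external_constraints(self, x: List[float]) -> bool:
        # A point is feasible only if every registered constraint passes.
        return all(g(x) for g in self.external_constraints.values())

    def run_optimization_step(self, suggested_sample: List[float]) -> bool:
        # Inside the run_optimization loop: gate objective evaluation
        # on feasibility of the suggested sample.
        if self.external_constraints is not None:
            if not self.check_external_constraints(suggested_sample):
                return False  # infeasible: skip/penalize this sample
        return True  # feasible: proceed to self.objective.evaluate(...)

# Usage: any workspace callable can act as a black-box constraint.
bo = BO(external_constraints={
    "latency": lambda x: sum(xi * xi for xi in x) <= 1.0,
})
```

Whether infeasible samples are skipped, penalized, or folded into a constraint surrogate (as in Gelbart et al. 2014) is exactly the design decision the branch would have to make.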
Any update concerning this feature? I was also interested in it.
Are there any updates regarding this from the community? I need a constraint that is calculated via a function defined in my workspace, and I'm currently unable to use it that way.
Just a quick note here - GPyOpt is effectively archived and isn't developed anymore. We have only left the repo open to keep the issues available for discussion. So I'm afraid the only way to have this feature added is for someone to fork the project and develop it themselves!
The current implementation of "constraints" is very restricted: one cannot call a function defined in the workspace to be evaluated, so general black-box constraints, output constraints, etc. cannot be realized in the current GPyOpt. Consider the example below and see the concerns.
Minimal Code
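(The original code block appears to have been lost in formatting. The failure mechanism it demonstrated can be sketched without GPyOpt; the names below are illustrative, and the empty dict stands in for the library module's namespace.)

```python
def fun(x):
    """User-defined workspace function referenced by the constraint string."""
    return x * x

# Simulate what GPyOpt effectively does internally: compile the
# constraint string via exec(..., globals()) inside the *library
# module*, whose globals do not include the user's workspace names.
library_namespace = {}
exec("constraint = lambda x: fun(x) - 6", library_namespace)

# The lambda compiles fine, but its __globals__ is library_namespace,
# so calling it cannot resolve the workspace's fun().
try:
    library_namespace["constraint"](2.0)
    failed = False
except NameError:
    failed = True
```

Here `failed` ends up `True`: the constraint string can never see a function defined only in the user's workspace, which is exactly the restriction this issue is about.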
Error trace:
However, `fun` was defined just before being used to add an output constraint. In this case, one could have repeated the `fun` script again inside the constraint string, but this might not always be possible.

**Concerns**
1) Can you update the `exec('constraint = lambda x:' + d['constraint'], globals())` statement and the constraint-compiling code to allow functions that are defined locally within the workspace?
2) Can you allow other black-box constraints to be added just the way you allow external evaluation of objectives?
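For concern (1), one possible direction is to compile the constraint string in a namespace that includes the caller's globals rather than the library's. A minimal sketch (the `make_constraint` helper is hypothetical, not part of GPyOpt):

```python
import inspect

def make_constraint(expr, caller_globals=None):
    """Hypothetical helper: compile a constraint expression string so it
    can resolve names defined in the caller's workspace."""
    if caller_globals is None:
        # Capture the caller's global namespace instead of relying on
        # the library module's globals().
        caller_globals = inspect.stack()[1].frame.f_globals
    namespace = dict(caller_globals)  # copy, so we don't mutate the caller
    exec("constraint = lambda x: " + expr, namespace)
    return namespace["constraint"]

def fun(x):
    """Workspace-defined function, visible to the compiled constraint."""
    return x * x

g = make_constraint("fun(x) - 6")
result = g(2.0)  # fun(2.0) - 6 = -2.0; no NameError this time
```

Passing the caller's namespace explicitly (or simply accepting callables instead of strings, as in concern (2)) would sidestep the `exec`/`globals()` restriction entirely.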