bayesian-optimization / BayesianOptimization

A Python implementation of global optimization with gaussian processes.
https://bayesian-optimization.github.io/BayesianOptimization/index.html
MIT License

Crashing when function to optimize is constant #255

Closed Golbstein closed 3 years ago

Golbstein commented 4 years ago

When my black_box_function is a constant, the optimization crashes:

from bayes_opt import BayesianOptimization

def black_box_function(x, y):
    return 2

pbounds = {'x': (2, 4), 'y': (-3, 3)}

optimizer = BayesianOptimization(f=black_box_function, pbounds=pbounds, random_state=1)

optimizer.maximize(
    init_points=2,
    n_iter=1,
)
StopIteration                             Traceback (most recent call last)
c:\projects\buildots\pycode\venv\lib\site-packages\bayes_opt\bayesian_optimization.py in maximize(self, init_points, n_iter, acq, kappa, xi, **gp_params)
    168             try:
--> 169                 x_probe = next(self._queue)
    170             except StopIteration:

c:\projects\buildots\pycode\venv\lib\site-packages\bayes_opt\bayesian_optimization.py in __next__(self)
     25         if self.empty:
---> 26             raise StopIteration("Queue is empty, no more objects to retrieve.")
     27         obj = self._queue[0]

StopIteration: Queue is empty, no more objects to retrieve.

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-13-9b4a4e7226f3> in <module>
     10 optimizer.maximize(
     11     init_points=2,
---> 12     n_iter=1,
     13 )

c:\projects\buildots\pycode\venv\lib\site-packages\bayes_opt\bayesian_optimization.py in maximize(self, init_points, n_iter, acq, kappa, xi, **gp_params)
    169                 x_probe = next(self._queue)
    170             except StopIteration:
--> 171                 x_probe = self.suggest(util)
    172                 iteration += 1
    173 

c:\projects\buildots\pycode\venv\lib\site-packages\bayes_opt\bayesian_optimization.py in suggest(self, utility_function)
    122         with warnings.catch_warnings():
    123             warnings.simplefilter("ignore")
--> 124             self._gp.fit(self._space.params, self._space.target)
    125 
    126         # Finding argmax of the acquisition function.

c:\projects\buildots\pycode\venv\lib\site-packages\sklearn\gaussian_process\_gpr.py in fit(self, X, y)
    232             optima = [(self._constrained_optimization(obj_func,
    233                                                       self.kernel_.theta,
--> 234                                                       self.kernel_.bounds))]
    235 
    236             # Additional runs are performed from log-uniform chosen initial

c:\projects\buildots\pycode\venv\lib\site-packages\sklearn\gaussian_process\_gpr.py in _constrained_optimization(self, obj_func, initial_theta, bounds)
    501             opt_res = scipy.optimize.minimize(
    502                 obj_func, initial_theta, method="L-BFGS-B", jac=True,
--> 503                 bounds=bounds)
    504             _check_optimize_result("lbfgs", opt_res)
    505             theta_opt, func_min = opt_res.x, opt_res.fun

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\optimize\_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
    598     elif meth == 'l-bfgs-b':
    599         return _minimize_lbfgsb(fun, x0, args, jac, bounds,
--> 600                                 callback=callback, **options)
    601     elif meth == 'tnc':
    602         return _minimize_tnc(fun, x0, args, jac, bounds, callback=callback,

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\optimize\lbfgsb.py in _minimize_lbfgsb(fun, x0, args, jac, bounds, disp, maxcor, ftol, gtol, eps, maxfun, maxiter, iprint, callback, maxls, **unknown_options)
    333             # until the completion of the current minimization iteration.
    334             # Overwrite f and g:
--> 335             f, g = func_and_grad(x)
    336         elif task_str.startswith(b'NEW_X'):
    337             # new iteration

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\optimize\lbfgsb.py in func_and_grad(x)
    283     else:
    284         def func_and_grad(x):
--> 285             f = fun(x, *args)
    286             g = jac(x, *args)
    287             return f, g

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\optimize\optimize.py in function_wrapper(*wrapper_args)
    324     def function_wrapper(*wrapper_args):
    325         ncalls[0] += 1
--> 326         return function(*(wrapper_args + args))
    327 
    328     return ncalls, function_wrapper

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\optimize\optimize.py in __call__(self, x, *args)
     62     def __call__(self, x, *args):
     63         self.x = numpy.asarray(x).copy()
---> 64         fg = self.fun(x, *args)
     65         self.jac = fg[1]
     66         return fg[0]

c:\projects\buildots\pycode\venv\lib\site-packages\sklearn\gaussian_process\_gpr.py in obj_func(theta, eval_gradient)
    223                 if eval_gradient:
    224                     lml, grad = self.log_marginal_likelihood(
--> 225                         theta, eval_gradient=True, clone_kernel=False)
    226                     return -lml, -grad
    227                 else:

c:\projects\buildots\pycode\venv\lib\site-packages\sklearn\gaussian_process\_gpr.py in log_marginal_likelihood(self, theta, eval_gradient, clone_kernel)
    474             y_train = y_train[:, np.newaxis]
    475 
--> 476         alpha = cho_solve((L, True), y_train)  # Line 3
    477 
    478         # Compute log-likelihood (compare line 7)

c:\projects\buildots\pycode\venv\lib\site-packages\scipy\linalg\decomp_cholesky.py in cho_solve(c_and_lower, b, overwrite_b, check_finite)
    194     (c, lower) = c_and_lower
    195     if check_finite:
--> 196         b1 = asarray_chkfinite(b)
    197         c = asarray_chkfinite(c)
    198     else:

c:\projects\buildots\pycode\venv\lib\site-packages\numpy\lib\function_base.py in asarray_chkfinite(a, dtype, order)
    494     if a.dtype.char in typecodes['AllFloat'] and not np.isfinite(a).all():
    495         raise ValueError(
--> 496             "array must not contain infs or NaNs")
    497     return a
    498 

ValueError: array must not contain infs or NaNs
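A workaround until this is handled upstream: perturb the constant return value with tiny noise so the observed targets are not all identical, which gives the GP fit non-degenerate data to work with. This is a sketch of that idea; the noise scale of 1e-6 is an arbitrary assumption, not a recommendation from the library.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box_function(x, y):
    # Hypothetical workaround: add tiny noise to the otherwise-constant
    # objective so the observed target values are not all identical.
    # The 1e-6 scale is an assumption; pick something negligible relative
    # to the differences you care about.
    return 2 + 1e-6 * rng.standard_normal()
```

With this wrapper the optimizer sees a (numerically) non-constant objective, at the cost of a meaningless but harmless ordering among the probed points.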
sharpe5 commented 3 years ago

I also ran into this problem. None of my demo code worked, and the error message is misleading, to say the least.

I almost discarded this package entirely as a result, and I'm sure other people hitting this error will conclude that the whole package is irreparably broken at some level. This would mostly be an issue for devs just starting out on the learning curve for this package.

A better error message would be:

ValueError: array must not contain infs or NaNs. Likely cause: function cannot return a constant.
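One way the library could surface such a message is a pre-fit guard that checks whether the observed targets have any variation before handing them to the GP. This is a hypothetical sketch, not part of bayes_opt; the function name and tolerance are my own.

```python
import numpy as np

def check_target_variation(target, tol=1e-12):
    """Hypothetical pre-fit guard (not part of bayes_opt): raise a clearer
    error when every observed target value is numerically identical."""
    target = np.asarray(target, dtype=float)
    # ptp (peak-to-peak) is zero when all observed values are the same.
    if np.ptp(target) < tol:
        raise ValueError(
            "All observed target values are identical; the Gaussian "
            "process cannot be fit. Is your objective function constant?"
        )
```

Called on `self._space.target` before `self._gp.fit(...)`, this would replace the opaque infs/NaNs error with one that points at the actual cause.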
fmfn commented 3 years ago

Likely cause: function cannot return a constant.

And how exactly is the computer supposed to know your function is constant, if the whole point is to handle black-box functions? This error is thrown by sklearn's call to scipy while fitting the Gaussian process.