ZedThree opened this issue 2 years ago (Open)
@ZedThree This one is a little confusing - I think it's a problem with how the `GpRegressor` object that's held inside the instance of `GpOptimizer` is parsing the input data.

For example, if I take this demo script in the repo and swap the kernel to `SquaredExponential() + WhiteNoise()`, it fails in the way I'd expect (because `CompositeCovariance` is missing some methods required by `GpOptimizer`):
```
Traceback (most recent call last):
  File "/home/cbowman/inference-tools/demos/scripts/GpOptimiser_demo.py", line 116, in <module>
    new_x = GP.propose_evaluation()
  File "/home/cbowman/inference-tools/inference/gp.py", line 829, in propose_evaluation
    proposed_ev, max_acq = self.multistart_bfgs()
  File "/home/cbowman/inference-tools/inference/gp.py", line 802, in multistart_bfgs
    results = [self.launch_bfgs(x0) for x0 in starting_positions]
  File "/home/cbowman/inference-tools/inference/gp.py", line 802, in <listcomp>
    results = [self.launch_bfgs(x0) for x0 in starting_positions]
  File "/home/cbowman/inference-tools/inference/gp.py", line 790, in launch_bfgs
    return fmin_l_bfgs_b(
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/lbfgsb.py", line 197, in fmin_l_bfgs_b
    res = _minimize_lbfgsb(fun, x0, args=args, jac=jac, bounds=bounds,
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/lbfgsb.py", line 306, in _minimize_lbfgsb
    sf = _prepare_scalar_function(fun, x0, jac=jac, args=args, epsilon=eps,
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/optimize.py", line 261, in _prepare_scalar_function
    sf = ScalarFunction(fun, x0, args, grad, hess,
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/_differentiable_functions.py", line 140, in __init__
    self._update_fun()
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/_differentiable_functions.py", line 233, in _update_fun
    self._update_fun_impl()
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/_differentiable_functions.py", line 137, in update_fun
    self.f = fun_wrapped(self.x)
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/_differentiable_functions.py", line 134, in fun_wrapped
    return fun(np.copy(x), *args)
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/optimize.py", line 74, in __call__
    self._compute_if_needed(x, *args)
  File "/home/cbowman/.local/lib/python3.9/site-packages/scipy/optimize/optimize.py", line 68, in _compute_if_needed
    fg = self.fun(x, *args)
  File "/home/cbowman/inference-tools/inference/acquisition.py", line 99, in opt_func_gradient
    dmu, dvar = self.gp.spatial_derivatives(x)
  File "/home/cbowman/inference-tools/inference/gp.py", line 380, in spatial_derivatives
    A, _ = self.cov.gradient_terms(q[0, :], self.x, self.cov_hyperpars)
  File "/home/cbowman/inference-tools/inference/covariance.py", line 32, in gradient_terms
    raise NotImplementedError(
NotImplementedError:
Gradient calculations are not yet available for the
<class 'inference.covariance.CompositeCovariance'> covariance function.

Process finished with exit code 1
```
This means once the bug is fixed, it'll still fail, but at least with a proper error message! I can implement the missing code to get `GpOptimizer` to work with a composite covariance function if it's useful to you though - I've been meaning to do it for a while.
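For what it's worth, the mathematical piece that's missing is straightforward for a sum of kernels: since K(x, y) = K1(x, y) + K2(x, y), the spatial derivative also just sums over the components, so a composite covariance could delegate gradient calculations to its parts and add the results. A minimal numpy sketch of that sum rule (toy kernel functions only, not the inference-tools API - the names here are invented for illustration):

```python
import numpy as np

# Toy 1D kernels, NOT the inference-tools API: illustrates that the
# spatial gradient of a sum kernel is the sum of the component gradients.

def rbf(x, y, ls=1.0):
    # squared-exponential kernel value
    return np.exp(-0.5 * ((x - y) / ls) ** 2)

def rbf_grad(x, y, ls=1.0):
    # analytic derivative of the squared-exponential w.r.t. x
    return -((x - y) / ls**2) * rbf(x, y, ls)

def white(x, y, sigma=0.1):
    # white noise contributes only where x == y
    return sigma**2 if x == y else 0.0

def white_grad(x, y, sigma=0.1):
    # zero almost everywhere, so its spatial gradient vanishes
    return 0.0

def composite(x, y):
    return rbf(x, y) + white(x, y)

def composite_grad(x, y):
    # sum rule: a composite kernel's gradient is the sum of its
    # components' gradients
    return rbf_grad(x, y) + white_grad(x, y)

# sanity-check the analytic sum-rule gradient against finite differences
x, y, h = 0.7, 0.2, 1e-6
numeric = (composite(x + h, y) - composite(x - h, y)) / (2 * h)
print(abs(numeric - composite_grad(x, y)) < 1e-6)  # expect True
```

The same delegation pattern would extend to products of kernels via the product rule, though the bookkeeping for hyperparameter gradients is a bit more involved.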
I can try to reproduce the bug by playing around with the input data shape / type, but if you could send me a script which crashes with the bug, that would be easiest.
I'll see if I can reduce my script down a bit -- the objective function is a Fortran code, so it's not straightforward to share.
This was just something I was playing about with, so it's not terribly urgent.
If I try to use a kernel like `composite_kernel = SquaredExponential() + WhiteNoise()` with `GpOptimizer`, the kernel evaluations have different shapes and I get the following traceback: