How many epochs and iterations were completed before this error occurred? This error can occur when the lengthscales / nugget of the GP can't be optimized because the data passed is too dense.
Sometimes it happened after 20-ish epochs during the second iteration; sometimes it took longer.
What do you mean by saying that the data is too 'dense', please?
Finding the optimal lengthscales in a Gaussian Process requires inverting the covariance matrix built from the data - however, this is impossible if two rows or columns of that matrix are identical (or very close together, which will eventually happen once the Bayesian optimization has run its course and is sampling points very close together). It's very likely that there is no point in continuing optimization in your scenario, since it has run for 20 epochs and there are only 3 input dimensions.
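A minimal R sketch (not from this thread) of what goes wrong: a duplicated input point makes the kernel matrix singular, and a small nugget on the diagonal repairs it.

```r
x <- c(0.10, 0.50, 0.50)          # third point duplicates the second
K <- exp(-as.matrix(dist(x))^2)   # squared-exponential kernel, lengthscale = 1
try(chol(K))                      # fails: "the leading minor of order 3 is not positive definite"
chol(K + diag(1e-6, length(x)))   # a small nugget on the diagonal restores positive definiteness
```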
One way around this is to pass your own nugget factor to the kernel. This will add a number (the nugget) to the diagonal of the matrix so it is not singular. Any parameters passed to the dots (...) argument of bayesOpt() will be passed on to DiceKriging::km(). The options are very problem specific, so I don't know if I'll be much more help. Passing nugget.estim=TRUE
to bayesOpt() tends to fix these singularity problems:
https://cran.r-project.org/web/packages/DiceKriging/DiceKriging.pdf
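For concreteness, here is a minimal sketch (with a toy objective, not your xgboost setup) of passing nugget.estim=TRUE through bayesOpt() to km():

```r
library(ParBayesianOptimization)

# bayesOpt() expects FUN to return a list containing a Score element.
scoreFun <- function(x, y) list(Score = -(x - 2)^2 - (y + 1)^2)

opt <- bayesOpt(
  FUN = scoreFun,
  bounds = list(x = c(-5, 5), y = c(-5, 5)),
  initPoints = 6,
  iters.n = 10,
  nugget.estim = TRUE   # not a bayesOpt() argument itself; forwarded via ... to DiceKriging::km()
)
```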
I was using ParBayesianOptimization to tune the hyperparameters of xgboost with gblinear as the booster. The search space is specified as:
nrounds = c(1L, 100L), lambda = c(0.1, 10), alpha = c(0.1, 10)
I kept getting the following error message:
Error encountered while training GP: <the leading minor of order 9 is not positive definite>
Could you tell me what it means? Thank you!
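For reference, a hedged reconstruction of roughly what this setup might look like with the nugget.estim=TRUE fix suggested above (the dataset, scoring function, and metric are assumptions; only the bounds come from the question):

```r
library(xgboost)
library(ParBayesianOptimization)

data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Cross-validated AUC as the score to maximize (assumed metric).
scoringFunction <- function(nrounds, lambda, alpha) {
  cv <- xgb.cv(
    params = list(
      booster = "gblinear",
      objective = "binary:logistic",
      eval_metric = "auc",
      lambda = lambda,
      alpha = alpha
    ),
    data = dtrain,
    nrounds = nrounds,
    nfold = 5,
    verbose = 0
  )
  list(Score = max(cv$evaluation_log$test_auc_mean))
}

bounds <- list(
  nrounds = c(1L, 100L),
  lambda  = c(0.1, 10),
  alpha   = c(0.1, 10)
)

opt <- bayesOpt(
  FUN = scoringFunction,
  bounds = bounds,
  initPoints = 6,
  iters.n = 20,
  nugget.estim = TRUE   # forwarded to DiceKriging::km(); helps avoid the singularity error
)
```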