Closed ghost closed 3 years ago
I see the problem: somehow the scoring function was sampled at the same spot more than once, which causes the GP to fail. I'll start working on this.
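For intuition on why a duplicated sample breaks the GP, here is a minimal, hypothetical Python sketch (not the package's code): with an RBF kernel, scoring the same point twice produces two identical rows in the kernel matrix, which makes it singular and therefore impossible to invert when fitting the GP.

```python
import math

def rbf(a, b, ls=1.0):
    # squared-exponential kernel between two 1-D points
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def kernel_matrix(pts):
    return [[rbf(a, b) for b in pts] for a in pts]

def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

K_dup = kernel_matrix([0.0, 0.5, 0.5])  # 0.5 was "scored" twice
K_ok = kernel_matrix([0.0, 0.5, 1.0])   # three distinct samples

# duplicated point -> identical rows -> det3(K_dup) is exactly 0 (singular);
# distinct points give a positive determinant and an invertible matrix
```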
`initPoints` is the number of iterations to run before any Gaussian process is trained - the GP needs something to work with before it can be built. `gsPoints` is a little more complicated. Finding the global optimum of the acquisition function on a GP is not trivial - you need to use Newton's method to find many local optima, and choose the best one from those. This is assumed to be the global optimum. `gsPoints` determines how many times we run Newton's method from different starting positions. The more points you sample, the more confident you can be that you found the global optimum.
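The multi-start idea behind `gsPoints` can be sketched in a few lines. This is a hypothetical illustration, not the package's implementation: it uses plain finite-difference gradient ascent instead of Newton's method, and a toy multimodal function in place of a real acquisition surface.

```python
import math
import random

def acquisition(x):
    # toy multimodal stand-in for an acquisition surface
    return math.sin(3 * x) - 0.1 * x * x

def local_ascent(x, step=0.01, iters=500):
    # crude local optimizer: finite-difference gradient ascent
    for _ in range(iters):
        grad = (acquisition(x + 1e-6) - acquisition(x - 1e-6)) / 2e-6
        x += step * grad
    return x

def multistart_maximize(gs_points, seed=0):
    # run the local optimizer from gs_points random starts, keep the best
    rng = random.Random(seed)
    starts = [rng.uniform(-5, 5) for _ in range(gs_points)]
    return max((local_ascent(s) for s in starts), key=acquisition)
```

With a fixed seed, adding more starting points can only match or improve the best value found, which is why a larger `gsPoints` buys confidence in the global optimum at the cost of extra compute.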
In the meantime, there are some things you can do to try to get around this. There really isn't a reason to run in parallel here - the model already runs pretty fast - and if the error is caused by the parallel implementation, running sequentially might help. To run sequentially, just set `iters.k = 1` and `parallel = FALSE`.
This code with `BostonHousing` is just a simple trial; next time I will start with my own, larger data. That is why I want to confirm that the trial code works properly with parallel computation, even though it is fast. And I understand now that the `bayesOpt` function uses Newton's method to search for the global optimum. Thank you!
This is exactly the same issue I reported before; good to know that we are working toward a solution!
This happens when:
1. `iters.k > 1`
2. A selected local optimum was at the bound limits
3. The noise added simply returned the same values, since the local optimum was already at the bounds.
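A minimal, hypothetical sketch of that failure mode (the jitter values and the clipping step are illustrative, not the package's actual code): when the chosen optimum sits on a bound, jittered candidate points that land out of range and get pushed back to the bound can collapse onto the same point.

```python
def clip(x, lower, upper):
    # push an out-of-range candidate back to the nearest bound
    return min(max(x, lower), upper)

# chosen optimum sits exactly on the upper bound of [0, 1]
optimum, lower, upper = 1.0, 0.0, 1.0

# illustrative jitter for four extra candidates (as when iters.k > 1)
jitter = [0.05, -0.02, 0.08, 0.11]
candidates = [clip(optimum + j, lower, upper) for j in jitter]

# three of the four candidates clip to exactly 1.0 -> duplicated samples,
# the same condition that makes the GP fail
```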
Fixed with commit 0c7bef7
Hello. I am interested in finding the best machine learning parameters by applying parallel Bayesian optimization. I have run the code with SVM radial regression on the built-in dataset `BostonHousing`. However, the iteration often stopped before the full iteration count (`iters.n + iters.k`) with the following error message. The message does not seem to be related to the other Bayesian parameters (cores, acquisition function, etc.), and the point at which it stops varies even between trials with the same parameter settings. Is this a real error, or just an early-stop notification (meaning that further parameter searches would be pointless)? If the latter, is it because the `BostonHousing` data is small, or because the SVM parameter range is small? Here is the result of Bayesian utility per epoch with 10 cores.
And here is the parameter search range of the SVM RBF kernel.
[Additional Question] What is the difference between `initPoints` and `gsPoints` in the `bayesOpt` function?