stevengj / nlopt

library for nonlinear optimization, wrapping many algorithms for global and local, constrained or unconstrained, optimization

Hooke-Jeeves algorithm #306

Open sovrasov opened 4 years ago

sovrasov commented 4 years ago

Hi @stevengj. Are you interested in adding the Hooke-Jeeves method? nlopt already contains similar heuristic-based stuff (Nelder-Mead and Sbplx), but still. One of my colleagues complained that nlopt's Nelder-Mead is much slower than my implementation of HJ on the 20d generalized Rosenbrock function. I haven't benchmarked the methods myself yet. To do that correctly I'd rather have both methods implemented in nlopt, so HJ would be a side product of that comparison.
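For context, Hooke-Jeeves is a direct-search method that alternates coordinate-wise exploratory moves with larger "pattern" jumps along the direction of recent progress. A minimal pure-Python sketch on the generalized Rosenbrock function mentioned above (illustrative only — this is not NLopt code or the implementation being discussed; the function signature and parameter defaults are made up for this example):

```python
def generalized_rosenbrock(x):
    """Generalized Rosenbrock function; its minimum is 0 at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-8, max_evals=200_000):
    """Simplified Hooke-Jeeves pattern search (illustrative sketch).

    Alternates exploratory +/- probes along each coordinate with a pattern
    jump along the successful direction; shrinks the step when stuck.
    Returns (best_point, best_value, number_of_evaluations).
    """
    n = len(x0)
    base = list(x0)
    fbase = f(base)
    evals = 1
    while step > tol and evals < max_evals:
        # Exploratory move: probe +/- step along each coordinate in turn.
        trial, ftrial = list(base), fbase
        for i in range(n):
            for d in (step, -step):
                cand = list(trial)
                cand[i] += d
                fc = f(cand)
                evals += 1
                if fc < ftrial:
                    trial, ftrial = cand, fc
                    break
        if ftrial < fbase:
            # Pattern move: jump further along the direction base -> trial.
            pattern = [2.0 * t - b for t, b in zip(trial, base)]
            fpattern = f(pattern)
            evals += 1
            if fpattern < ftrial:
                base, fbase = pattern, fpattern
            else:
                base, fbase = trial, ftrial
        else:
            step *= shrink  # No coordinate move improved: refine the mesh.
    return base, fbase, evals
```

Because it only compares function values, it needs no gradients, which is why it is naturally benchmarked against Nelder-Mead and Sbplx.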

stevengj commented 4 years ago

In general, I'm open to including new algorithms, but I'd typically like to see a reasonable problem where the new algorithm is better than anything currently in NLopt, or otherwise adds some feature not available in the existing algorithms. So it would be good to compare HJ to the other derivative-free algorithms in NLopt too.

However, you don't need to implement the method in NLopt to do the comparison. For optimization methods like this, a good metric is the number of objective-function evaluations required to reach minimum+ε for some given ε (for test problems with a known minimum). (In NLopt, use nlopt_set_stopval.) Not wall-clock time. This makes it possible to compare algorithms independent of language, compiler, etcetera.
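This metric is easy to collect outside NLopt as well: wrap the objective in a counter that aborts as soon as the value drops to minimum+ε, mimicking what nlopt_set_stopval does internally. A sketch in plain Python (the helper names and the toy solver here are invented for illustration, not part of any library):

```python
class StopvalReached(Exception):
    """Raised by the wrapper once the target objective value is hit."""

def make_counting_objective(f, stopval):
    """Wrap f so every call is counted; abort when f(x) <= stopval."""
    count = {"evals": 0}
    def wrapped(x):
        count["evals"] += 1
        fx = f(x)
        if fx <= stopval:
            # count["evals"] is now the metric: evaluations to minimum+eps.
            raise StopvalReached
        return fx
    return wrapped, count

# Demo with a toy solver (fixed-step coordinate descent) on a quadratic
# whose known minimum is 0 at the origin.
def sphere(x):
    return sum(xi * xi for xi in x)

def coordinate_descent(f, x0, step=0.1, iters=10_000):
    x = list(x0)
    fx = f(x)
    for _ in range(iters):
        for i in range(len(x)):
            for d in (step, -step):
                cand = list(x)
                cand[i] += d
                fc = f(cand)
                if fc < fx:
                    x, fx = cand, fc
    return x

eps = 1e-3
wrapped, count = make_counting_objective(sphere, stopval=eps)
try:
    coordinate_descent(wrapped, [1.0, -1.0])
except StopvalReached:
    pass
print("evaluations to reach minimum+eps:", count["evals"])
```

The same wrapper works around any solver in any language, which is the point: the count is comparable across implementations, unlike wall-clock time.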

(Nelder-Mead itself is mainly in NLopt for comparison purposes, as is Praxis — these are not algorithms I would normally recommend for real problems.)

justinmcgrath commented 4 years ago

I have a problem that the Hooke-Jeeves method works well with. It's part of an optimization package, but I like the interface of NLopt considerably more. However, I can't find something in NLopt that works as well. What is the comparable algorithm in NLopt? (I'm a biologist and I am not familiar with the options.) I'd like to try something from NLopt, and I think it would be simple for me to compare the number of function calls between NLopt and what I'm using.

Simulated annealing also works well for my problem, but the software I'm using (also different from the Hooke-Jeeves software) doesn't have working stopping conditions, so it runs far longer than I need it to. As above, is there a similar algorithm in NLopt? I tried CRS, which seems roughly similar, but it didn't work well.

stevengj commented 4 years ago

There are a bunch of global-optimization algorithms described here: https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/ … it's hard to say more without knowing anything about your problem.

Useful stopping criteria are always a problem for global optimization.