Stefan-Endres / shgo

Simplicial Homology Global Optimization
https://stefan-endres.github.io/shgo
MIT License

Question (theory) #35

Closed · microprediction closed this issue 3 years ago

microprediction commented 3 years ago

It seems plausible that an SHGO variant might be capable of finding good plateaus, which is to say relatively stable local minima where there is no particularly steep dropoff nearby. I realize that isn't a rigorous definition, but do you have thoughts on how this might be hacked to achieve the intent, approximately?

alchemyst commented 3 years ago

It depends very strongly on your assumptions about the derivative magnitude limits. We assume Lipschitz continuity as a minimum requirement for the method. If you know more, such as the value of the maximum derivative, it becomes easier to bound the minima given the simplexes. It might be worth trying to disable the local minimisation routine and checking the function values only at the simplex evaluation points. If you have the ability to calculate a derivative (which shgo doesn't rely on), you might be able to cheaply bound the minima that way as well.
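
As a rough illustration of the bound mentioned above (not something shgo provides; the function name, the example values, and the Lipschitz constant are assumptions for the sketch), a known Lipschitz constant L lets you lower-bound the objective over a simplex using only the vertex evaluations:

```python
import numpy as np

def lipschitz_lower_bound(vertices, f_values, L):
    """Lower bound on f over a simplex when |f(x) - f(y)| <= L * ||x - y||.

    Any x in the simplex is a convex combination of its vertices, so
    ||x - v_i|| <= max_j ||v_j - v_i||, and therefore for every vertex v_i
        f(x) >= f(v_i) - L * max_j ||v_j - v_i||.
    The tightest of these per-vertex bounds is returned.
    """
    vertices = np.asarray(vertices, dtype=float)
    f_values = np.asarray(f_values, dtype=float)
    bounds = [fi - L * np.max(np.linalg.norm(vertices - vi, axis=1))
              for vi, fi in zip(vertices, f_values)]
    return max(bounds)

# Example: a 2-simplex (triangle) with nearly equal vertex values looks
# "plateau-like"; the bound says how far f could still drop inside it.
verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
fvals = [1.20, 1.25, 1.23]
print(lipschitz_lower_bound(verts, fvals, L=3.0))
```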

Stefan-Endres commented 3 years ago

To add to what @alchemyst said, you can pass options = {'minimize_every_iter': False} through the options argument to disable local minimisation except for a round-off at the final iteration.
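
For concreteness, a minimal sketch of that call using the scipy.optimize.shgo interface (which mirrors the stand-alone package); the objective and bounds here are placeholders:

```python
import numpy as np
from scipy.optimize import shgo

# Placeholder objective: a shallow basin (plateau-like) plus one sharp, narrow well.
def f(x):
    return 0.1 * np.sum(x**2) - 2.0 * np.exp(-50.0 * np.sum((x - 0.8)**2))

bounds = [(-1.0, 1.0), (-1.0, 1.0)]

# Skip local minimisation on intermediate iterations; only the final
# round-off minimisation is performed.
res = shgo(f, bounds, n=128, iters=3,
           options={'minimize_every_iter': False},
           sampling_method='sobol')

print(res.xl)    # pool of minimisers found
print(res.funl)  # their objective values
```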

Using the Decimal class, if possible, should improve SHGO's ability to detect stable local minima at a finer tolerance. Adding custom tolerance arguments to the local solvers through minimizer_kwargs could also improve the round-off. In particular, for the default solver I would recommend changing the 'ftol' and 'eps' values to better fit the numerical tolerances of the local optimisation problem within the "plateau" neighbourhood of the local minimizer (i.e. a starting point near the true local minimum). Here I would add a factor based on the difference in the objective function output in the neighbourhood of a local minimizer.
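
A sketch of passing such tolerances through minimizer_kwargs (SLSQP is the default local solver; plateau_df is an illustrative, problem-specific estimate of how much the objective varies across the plateau, not a shgo parameter):

```python
import numpy as np
from scipy.optimize import shgo

def f(x):
    # Very shallow bowl: the objective barely changes near the minimiser.
    return 1e-4 * np.sum(x**2)

bounds = [(-1.0, 1.0), (-1.0, 1.0)]

# Estimated variation of f over the plateau neighbourhood (illustrative only).
plateau_df = 1e-6

minimizer_kwargs = {
    'method': 'SLSQP',             # shgo's default local solver
    'options': {
        'ftol': 0.1 * plateau_df,  # convergence tolerance scaled to the plateau variation
        'eps': 1e-8,               # finite-difference step used for the gradient
    },
}

res = shgo(f, bounds, minimizer_kwargs=minimizer_kwargs,
           options={'minimize_every_iter': False})
print(res.x, res.fun)
```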

When SHGO does not find a feasible minimum at all it will fall back to the "best" sampling vertex; the set of nearest neighbouring points can then be retrieved using SHc.HC.V[tuple(res.)].nn (where SHc is an instantiated SHGO class object). This point, and the constraints constructed from it, can then be used as a new starting point, ideally with the same arguments for the local solver as discussed in the previous paragraph.
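
A sketch of what that lookup could look like, assuming the truncated expression above is tuple(res.x) and that the SHGO class, its construct_complex() driver, the HC.V vertex cache, and the vertex attributes .x_a, .f and .nn are exposed as described in the comment; these are internal details and may differ between versions:

```python
import numpy as np
from shgo._shgo import SHGO  # internal class, not the public shgo() function (assumed import path)

def f(x):
    return 0.1 * np.sum(x**2)

bounds = [(-1.0, 1.0), (-1.0, 1.0)]

SHc = SHGO(f, bounds, options={'minimize_every_iter': False})
SHc.construct_complex()   # run the sampling / triangulation / minimisation routine
res = SHc.res             # OptimizeResult filled in by the routine

# In the fallback case described above, res.x is itself a sampling vertex,
# so it is a key in the vertex cache. A small spread of neighbouring f values
# suggests a plateau-like neighbourhood.
v = SHc.HC.V[tuple(res.x)]
for neighbour in v.nn:
    print(neighbour.x_a, neighbour.f)
```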