Related to #103, I've read some papers about the Nelder-Mead algorithm and implemented a variant called the Adaptive Nelder-Mead Simplex (ANMS) algorithm. You can find the package at https://github.com/bicycle1885/ANMS.jl.
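For context, the "adaptive" part of ANMS is that the four simplex coefficients are scaled with the problem dimension n instead of being the fixed constants of standard Nelder-Mead, which is what helps on high-dimensional problems. A minimal sketch of the parameters as I understand them from the paper (the helper name is illustrative, not ANMS.jl's actual API):

```julia
# Dimension-adaptive simplex coefficients used by ANMS; the standard
# Nelder-Mead values are shown in comments for comparison.
# Illustrative helper, not ANMS.jl's actual API.
function adaptive_parameters(n::Integer)
    α = 1.0               # reflection  (standard: 1.0)
    β = 1.0 + 2.0 / n     # expansion   (standard: 2.0)
    γ = 0.75 - 1.0 / (2n) # contraction (standard: 0.5)
    δ = 1.0 - 1.0 / n     # shrinkage   (standard: 0.5)
    return α, β, γ, δ
end
```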
I hope this can be merged into Optim.jl, so I compared the performance of the two implementations.
The compared functions were a quadratic function, `dot(x - 1, x - 1)`, and the Rosenbrock function, each with five different dimensions (n = [2, 5, 10, 20, 50]).
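For concreteness, here are the two objectives in Julia. The quadratic is exactly the expression above (written with broadcasting for current Julia); the Rosenbrock definition is the standard generalized form, which may differ in detail from the one in the benchmark script:

```julia
using LinearAlgebra  # for dot (exported from Base in older Julia)

# Quadratic objective: minimum 0 at x = ones(n)
quadratic(x) = dot(x .- 1, x .- 1)

# Generalized Rosenbrock: minimum 0 at x = ones(n); hard for direct-search
# methods in high dimensions, which is what the n > 20 cases probe.
rosenbrock(x) = sum(100 * (x[i+1] - x[i]^2)^2 + (1 - x[i])^2 for i in 1:length(x)-1)
```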
The results are as follows (the benchmark tables for the quadratic and Rosenbrock functions are in the gist linked below):
ANMS.jl was faster and allocated less memory in all cases, and the minimal function values were comparable. Moreover, ANMS.jl was able to minimize high-dimensional (n > 20) Rosenbrock functions well.
The benchmark script and results are available at https://gist.github.com/bicycle1885/d0aaace2eab0ba9eec52. Please note that ANMS.jl was patched (https://gist.github.com/bicycle1885/d0aaace2eab0ba9eec52#file-anms-patch) so that its initial simplex and convergence criterion match those of Optim.jl.
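Roughly, the comparison has the following shape (a sketch only, not the gist's script; the Optim.jl side uses the current `NelderMead()` API, while older versions used `method = :nelder_mead`):

```julia
using LinearAlgebra, Optim

f(x) = dot(x .- 1, x .- 1)  # the quadratic objective from above

for n in (2, 5, 10, 20, 50)
    x0 = zeros(n)
    t = @elapsed res = optimize(f, x0, NelderMead())
    println("Optim.jl n=$n: minimum = $(Optim.minimum(res)) ($t s)")
    # The ANMS.jl side is analogous; `anms_minimize` below is a placeholder
    # name, not the package's actual entry point (see the gist for that):
    # t = @elapsed xmin, fmin = anms_minimize(f, x0)
end
```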
I'd like to know what you think about this proposal, and if you think more thorough benchmarks are needed, I'll run them as well.
Thank you.
Compared commits and benchmark environment

- ANMS.jl: 8734a0a + patch
- Optim.jl: 3ccbae7ec8b3c908eef900df5142c0bd660617f3