Closed kyu999 closed 8 years ago
I see. Thanks!
Many of our current multivariate solvers are either 1) quasi-Newton (various gradient descent solvers and (L-)BFGS) or 2) Newton (line search or trust region). However, we also have: Nelder-Mead, Particle Swarm, and Simulated Annealing.
If you can fit it in an `optimize(f, x, options)` call, we'll be happy to accept a PR :)
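To make that call shape concrete, here is a rough sketch of a real-valued genetic algorithm squeezed into an `optimize(f, x0, options)`-style signature. This is illustrative only — it's written in Python for brevity (an actual Optim solver would be Julia code against Optim's internals), and the name `ga_optimize` and all of its keyword options are invented for the example, not part of any package's API:

```python
# Hypothetical sketch: a real-valued GA shaped like optimize(f, x0, options).
# ga_optimize and every keyword below are made up for illustration; this is
# not Optim.jl's API.
import random

def ga_optimize(f, x0, popsize=40, generations=200,
                mutation_rate=0.2, sigma=0.3, seed=0):
    rng = random.Random(seed)
    n = len(x0)
    # initialize the population as Gaussian perturbations of the start point
    pop = [[xi + sigma * rng.gauss(0, 1) for xi in x0] for _ in range(popsize)]
    best, best_val = list(x0), f(x0)
    for _ in range(generations):
        fitness = [f(ind) for ind in pop]
        i = min(range(popsize), key=fitness.__getitem__)
        if fitness[i] < best_val:            # keep the best individual ever seen
            best, best_val = list(pop[i]), fitness[i]

        def pick():                          # binary tournament selection
            a, b = rng.randrange(popsize), rng.randrange(popsize)
            return pop[a] if fitness[a] < fitness[b] else pop[b]

        new_pop = []
        for _ in range(popsize):
            p1, p2 = pick(), pick()
            # uniform crossover followed by Gaussian mutation
            child = [p1[k] if rng.random() < 0.5 else p2[k] for k in range(n)]
            child = [c + sigma * rng.gauss(0, 1)
                     if rng.random() < mutation_rate else c for c in child]
            new_pop.append(child)
        pop = new_pop
    return best, best_val

# usage: minimize the sphere function from a poor starting point
x, val = ga_optimize(lambda v: sum(c * c for c in v), [2.0, 2.0])
```

The point is just that a population-based solver has no trouble hiding behind a single-start-point API: the start point seeds the initial population, and the options struct carries the GA-specific knobs.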
Scratch that, I was thinking about something else. @ahwillia is correct that it is probably better suited to another package.
Yes, there are some obvious gradient-free solvers like Nelder-Mead, but for the most part they take very different approaches from a GA. That being said, a GA isn't all that different from Particle Swarm... Why do we have PSO here, and do people use it? Should it be ported over to a different package?
I have never really cared to read anything about genetic algorithms, so I might have the wrong idea, and it is possible that a genetic algorithm implementation could fit in Optim.
Particle Swarm is here because it might be of interest to some people to have easy access to solvers other than gradient descent, BFGS, and so on. It's not that old in Optim either, so I don't think it'll go anywhere in the near future. Since the domain of Optim is, to a large extent, unconstrained continuous-variable optimization, I don't think PSO is a very weird solver to include, but others might disagree.
Generally, I think it is fair to say that a solver that can be easily run "successfully" with our quite basic API is welcome. The reason I think (thought? someone can prove me wrong) GA was maybe a worse fit is that I had a feeling it would require quite a bit of customization for each given problem. The discussion is sort of related to the question: should we have stuff like "multistart" algorithms? From a maintenance viewpoint, probably not, but some users might want easy access to them from Optim. What about the constrained optimization work that we have in a PR? Maybe not even that.
Perhaps some things should go into separate packages, but I do think that users of Optim should be able to call that sort of stuff from Optim using something that looks like our current API. In that sense we're currently in a hybrid state where we a) provide a single API for several different solvers and b) also maintain the solvers in the same package. Who knows, maybe in the future we will refactor a lot of the solver code out (to QuasiNewton.jl, Newton.jl, GeneticAlgorithms.jl (including PSO), ConstrainedOptim.jl, or what do I know) and use Optim.jl to organize them all.
I think we should distinguish the Optim.jl API from the Optim.jl codebase. I would love for us to use a standardized API for everything from genetic algorithms to constrained quasi-Newton methods, but I would be hesitant to keep pulling in new functionality without gaining a lot more developers who are putting in 3+ hours of work per week on Optim. A long time ago, Jiahao Chen made the point that all code should be considered a form of debt for a project -- the only assets are engineering hours per week. So I'm very cautious about pulling in new code.
I think this is the answer that fixes this issue. If someone wants to contribute and maintain a genetic solver, they can submit a PR or open a new issue with their specific plans.
Sorry for posting on a closed issue, but if you really have plans to include genetic algorithms with binary, floating-point, and permutation coding schemes, different kinds of crossover and mutation operators, compact genetic algorithms, etc., I can help. @pkofod
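To make the scope of that offer concrete, here is a rough sketch of two of the pieces mentioned: order crossover for permutation coding and bit-flip mutation for binary coding. It's written in Python for illustration, and the function names and signatures are made up for this comment, not taken from any existing package:

```python
import random

def order_crossover(p1, p2, i, j):
    # OX: keep p1[i:j] in place, fill the remaining slots with p2's
    # genes in p2's order, skipping genes already copied from p1
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    kept = set(p1[i:j])
    rest = iter(g for g in p2 if g not in kept)
    return [g if g is not None else next(rest) for g in child]

def bitflip_mutation(bits, rate, rng):
    # binary coding: flip each bit independently with probability `rate`
    return [b ^ 1 if rng.random() < rate else b for b in bits]

# usage: recombine two permutations, then mutate a bitstring
child = order_crossover([1, 2, 3, 4, 5], [5, 4, 3, 2, 1], 1, 3)
mutant = bitflip_mutation([0, 1, 1], rate=1.0, rng=random.Random(0))
```

Operators like these are exactly why a GA needs a representation-aware API: crossover on a permutation must preserve validity (the child above is still a permutation), which a generic real-valued interface doesn't capture.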
No, I believe this package is limited to algorithms that use gradient information. See BlackBoxOptim.jl for a package that implements metaheuristics like genetic algorithms.