ForceBru opened this issue 1 year ago
Looks like I got it. Apparently, when I specify one initial point, the entire population will be just copies of this point:
And then there's probably not enough variability for mutation or crossover to change anything, so the population doesn't change.
If I use `BoxConstraints`, then even `DE` starts working.
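For reference, a minimal sketch of that workaround. The objective, bounds, starting point, and population size below are my own illustrative choices, and the exact result depends on the random seed:

```julia
using Evolutionary

# Simple convex objective with its minimum at x = [0.0].
f(x) = x[1]^2

# With box constraints, the initial population is sampled inside the box
# instead of being identical copies of the starting point.
cons = BoxConstraints([-10.0], [10.0])
result = Evolutionary.optimize(f, cons, [3.0], DE(populationSize = 50))

Evolutionary.minimizer(result)  # should land near [0.0]
```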
Thanks for testing the package in this mode. I would never try to use these algorithms without specific parameters. But I understand that for someone new to evolutionary optimization the experience can be frustrating.
You are correct, the default parameters are useless. But there is no way to set default parameters for every model because the operators are population-dependent. If the population is represented by binary strings, you need operations specific to binary strings; the same goes for numerical functions, and so on. I think the best approach is to terminate the optimization (with a useful message) when no-op operators are used. The same problem exists with the initial population: most evolutionary algorithms require sufficient randomness in the population for proper optimization. Having the whole population set to the same value totally defeats the optimization technique. Adding some noise to a point-initialized population may work much better.
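The noise idea can be sketched in plain Julia; `noisy_population` is a hypothetical helper for illustration, not part of Evolutionary.jl:

```julia
# Hypothetical helper: jitter N copies of a single starting point so the
# seed population has the variability that mutation and crossover need,
# instead of N identical copies of one individual.
function noisy_population(x0::Vector{Float64}, N::Integer; sigma::Float64 = 0.1)
    [x0 .+ sigma .* randn(length(x0)) for _ in 1:N]
end

pop = noisy_population([1.0, 2.0], 10)
# All individuals are near [1.0, 2.0], but no two are identical copies.
```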
If you use `BoxConstraints` in the optimization call, then the population is sampled within the box, which noticeably improves the optimization results.
default parameters are useless ... there is no way to set default parameters for any model because operators are population dependent
Maybe there shouldn't be any default parameters, then? Having "useless" default parameters is misleading:
```julia
Evolutionary.optimize(x -> -sum(x), BitVector(zeros(3)), GA())
```

says that `[0,0,0]` is the minimum, but it's not. The no-argument constructor could throw an error, for example:
```julia
DE() = error("There is no way to set default parameters for any model because operators are population dependent, so please set parameters manually.")
```
It's especially strange with `GA`, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.
On the other hand, the default `DE()` seems to work fine when I use `BoxConstraints`, so the default params don't seem all that useless...
Having all population with the same value totally defeats the optimization technique.
Does it mean that the `optimize` methods that accept the `indiv` parameter aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.
Also, the documentation recommends this: `Evolutionary.optimize(f, x0, CMAES())`, and it actually works with `CMAES`, even though the population should consist of copies of `x0`.
It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.
Exactly my point: performing evolutionary optimization correctly is not only about the initial guess, but also about which mutation operations are used and how they are applied to the population, i.e. the rates.
Does it mean that optimize methods that accept the indiv parameter aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.
In the current implementation, it is problematic. I think more randomness needs to be added in such cases.
Also the documentation recommends this: Evolutionary.optimize(f, x0, CMAES()), and it actually works with CMAES, even though the population should consist of copies of x0.
By default, CMAES initializes its parameters with a very convoluted procedure that was refined over many years of research. CMAES becomes very fragile with incorrect parameters, and it requires a good understanding of both the algorithm and the problem to tune the algorithm for best performance.
I came across the same issue.
I suggest you change the (otherwise really helpful) tutorial so as not to include examples that use default parameters in GA and DE. These examples were the first I tried when familiarizing myself with the package, and it was a bit frustrating to get weird results even when copying the code. It took some time until I realised that the problem might be with the parameters and not with the syntax I used.
Same here. In my case I was using Evolutionary.jl with Optimization.jl, where you have to set an `x0` when defining the `OptimizationProblem`.
Got it to work by initializing with an `initial_population` rather than an individual `x0`. I sampled from a uniform distribution, but I could also imagine just adding `randn` noise to `x0`.
```julia
using Evolutionary, Optimization, Distributions

NP = 100
de = Evolutionary.DE(populationSize = NP)
lb, ub = 0.0, 1.0
U = Uniform(lb, ub)
# Seed with NP random individuals instead of NP copies of a single point.
x0 = Evolutionary.initial_population(de, [rand(U, 1) for i in 1:NP])
f = OptimizationFunction(foo)  # foo is the objective being optimized
prob = Optimization.OptimizationProblem(f, x0, p, lb = lb, ub = ub)  # p holds the problem parameters
sol = solve(prob, de)
```
TL;DR

`GA()` and `DE()` don't move away from the initial point: they say that any initial point is the optimum and report convergence, even though the algorithm isn't anywhere near the optimum.

I'm new to this, so this entire issue might be stupid, but I can't get the algorithms to work even with default settings; I can't even get started with the most basic things. Maybe the defaults could be adjusted somehow?
Basic example
Try to minimize `f(x) = x^2`:

No, `90^2 = 8100` is most definitely not the minimum of `x^2`. I can fiddle with the optimizer's settings, but it still doesn't move from the initial point:
All of these runs also report convergence, but the output is too long.
I eventually got the genetic algorithm to work after specifying `mutation=gaussian(), crossover=uniformbin()` (the defaults are apparently no-ops, but having no mutation and no crossover seems to defeat the purpose of having a genetic algorithm?). However, I couldn't get differential evolution to work, even for this simple function. Below are examples taken from the docs that don't seem to work either.
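For comparison, a sketch of the kind of call with the explicit operators mentioned above that escapes the starting point. The population size is my own choice, and since the run is stochastic the exact minimizer varies:

```julia
using Evolutionary

f(x) = x[1]^2

# Explicit gaussian mutation and uniform crossover give the all-copies
# initial population enough variability to move away from x0 = [90.0].
result = Evolutionary.optimize(
    f,
    [90.0],
    GA(populationSize = 100, mutation = gaussian(), crossover = uniformbin()),
)

Evolutionary.minimum(result)  # should be well below the starting value f([90.0]) = 8100
```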
GA example

This is taken from https://wildart.github.io/Evolutionary.jl/dev/tutorial/#Obtaining-results.
So apparently, the minimum is `f([0,0,0]) = 0`. However, I can get a lower value: `f([1,1,1]) = -3`. In fact, `GA` seems to accept any initial value as the solution:

DE algorithm

I took the target function from this page: https://wildart.github.io/Evolutionary.jl/dev/constraints/.
Then I use the genetic algorithm as shown on that page:
If I run this multiple times (`GA` seems to be randomized, so I wanted to draw more samples), I get about `[1, 3]` on average. So far so good.

Now try the default `DE()` with the same function and the same starting point `[0., 0.]`:

Change the starting point:
`DE` doesn't seem to care and says that the starting point is the optimum, whatever the starting point is.

GA again

Now try the same function as above with the default `GA()`:

Same as with the default `DE()`: it thinks that the initial point is the optimum, whatever the initial point is.