Closed · andresliszt closed this 3 years ago
Yes, this is the expected behavior of an optimization run where no feasible solution could be found.
Solving an optimization problem with 4000 variables using a standard GA without any customization needs more than 100 generations (probably a lot more, depending on the problem's complexity).
You can access the least infeasible solution found via `res.algorithm.opt[0]`, or set `return_least_infeasible=True` when calling the `minimize` function.
You might also want to do some hyperparameter optimization and try different combinations of `pop_size`, `n_offsprings`, `eta`, and `prob`. Plotting the convergence of the algorithm will help you compare the different configurations.
Thank you! Yes, that was the problem. I thought there was always a returned solution (feasible or not). It was just a test with that parameter, and I expected the solution to simply be far from optimal. Thank you very much!
I'm solving a multi-objective problem with several variables. When I use a small number of variables the problem is solved without issues, but when I use a large number of variables (around 4000) I'm not getting results; that is, `res.X`, `res.F` and `res.G` are `None`.
Below is my custom problem class and a function that I'm using for testing purposes (I will write a better one) to build the algorithm:

```python
class ChungPymooProblem(ProblemBase, Problem):
    """Implementation of Chung's paper in pymoo."""
```

```python
def make_algorithm_nsga2(problem: ChungPymooProblem) -> NSGA2:
    """NSGA-II algorithm with default configuration."""
```
I'm solving with `res = minimize(problem, algorithm, termination, seed=1, save_history=True, verbose=True)`, where `termination` is given by the pymoo function `get_termination("n_gen", 100)`.