anyoptimization / pymoo

NSGA2, NSGA3, R-NSGA3, MOEAD, Genetic Algorithms (GA), Differential Evolution (DE), CMAES, PSO
https://pymoo.org
Apache License 2.0

Result of ``pymoo.optimize.minimize`` has None attributes for variables, objectives and constraints #129

andresliszt closed this issue 3 years ago

andresliszt commented 3 years ago

I'm solving a multi-objective problem with several variables. With a small number of variables the problem is solved without issues, but with a large number of variables (around 4000) I'm not getting results: `res.X`, `res.F` and `res.G` are `None`.

Below are my custom problem class and a function that I'm using for testing purposes (I will write a better one) to build the algorithm:

```python
class ChungPymooProblem(ProblemBase, Problem):
    """Implementation of Chung's paper in pymoo."""

    def __init__(
        self,
        demand_for_skus: np.ndarray,
        cluster_amount_accomodate: np.ndarray,
        occurrence_matrix: np.ndarray,
    ):
        super().__init__(
            demand_for_skus, cluster_amount_accomodate, occurrence_matrix
        )

        super(  # pylint: disable=bad-super-call, unexpected-keyword-arg
            ProblemBase, self
        ).__init__(
            n_var=self.n_skus * self.n_clusters + 1,
            n_obj=2,
            n_constr=2 * self.n_clusters + self.n_skus,
            xl=0,
            xu=np.array(
                [
                    self.BIG_CONSTANT_FOR_UNBOUNDED,
                    *self.n_skus * self.n_clusters * [1],
                ]
            ),
            elementwise_evaluation=True,
        )

    def _evaluate(self, x, out, *args, **kwargs):
        """Evaluate the problem as ``pymoo`` requires.

        This method defines the objective functions and the
        constraints. The implementation syntax is purely dictated by
        ``pymoo``.

        """

        f1 = x[0]

        matrix_vars = self.list_to_matrix(x[1:], self.n_skus)

        f2 = -sum(
            self.quadratic_form(self.occurrence_matrix, var_list)
            for var_list in matrix_vars
        )

        out["F"] = [f1, f2]

        out["G"] = [
            *self.cluster_storage_capacity_constraint(matrix_vars),
            *self.cluster_pickup_frequency_constraint(matrix_vars, x[0]),
            *self.sku_only_in_one_cluster_constraint(matrix_vars.transpose()),
        ]
```
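For context on how the `out["G"]` values above are interpreted: pymoo's convention is that a constraint `g(x) <= 0` is satisfied, and a solution is feasible only when every entry of `G` is non-positive. A minimal numpy sketch of that convention (the `G` values here are made up for illustration):

```python
import numpy as np

# Hypothetical G values for three candidate solutions (one row each).
# pymoo's convention: g(x) <= 0 means the constraint is satisfied.
G = np.array([
    [-1.0, -0.5],   # all constraints satisfied -> feasible
    [ 0.2, -0.3],   # first constraint violated
    [ 1.0,  2.0],   # both constraints violated
])

# Constraint violation: sum of the positive parts, per solution.
cv = np.maximum(G, 0).sum(axis=1)
feasible = cv == 0

print(cv)        # [0.  0.2 3. ]
print(feasible)  # [ True False False]
```

With 2 * n_clusters + n_skus constraints over ~4000 variables, every individual in the population can easily have `cv > 0` for many generations.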

```python
def make_algorithm_nsga2(problem: ChungPymooProblem) -> NSGA2:
    """NSGA-II algorithm with default configuration.

    :param problem: Problem object required by the constructor.

    """

    mask = ["real", *(problem.n_var - 1) * ["int"]]

    sampling = MixedVariableSampling(
        mask,
        {
            "real": get_sampling("real_random"),
            "int": get_sampling("int_random"),
        },
    )

    crossover = MixedVariableCrossover(
        mask,
        {
            "real": get_crossover("real_sbx", prob=1.0, eta=3.0),
            "int": get_crossover("int_sbx", prob=1.0, eta=3.0),
        },
    )

    mutation = MixedVariableMutation(
        mask,
        {
            "real": get_mutation("real_pm", eta=3.0),
            "int": get_mutation("int_pm", eta=3.0),
        },
    )

    algorithm = NSGA2(
        pop_size=100,
        n_offsprings=10,
        sampling=sampling,
        crossover=crossover,
        mutation=mutation,
        eliminate_duplicates=True,
    )

    return algorithm
```
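The mask pairs one real variable (the unbounded `x[0]`) with `n_var - 1` integer assignment variables bounded in `[0, 1]`. As an illustration of what such mixed sampling produces, here is a hand-rolled sketch (not the pymoo operators themselves; `sample_mixed` and its bounds are made up for this example):

```python
import numpy as np

rng = np.random.default_rng(1)

n_var = 7                                # e.g. 1 real + 6 integer variables
mask = ["real"] + ["int"] * (n_var - 1)

def sample_mixed(mask, pop_size, real_high=1e6, int_high=1):
    """Sample a population where each column follows its mask entry."""
    pop = np.empty((pop_size, len(mask)))
    for j, kind in enumerate(mask):
        if kind == "real":
            pop[:, j] = rng.uniform(0.0, real_high, size=pop_size)
        else:  # integers in {0, ..., int_high}
            pop[:, j] = rng.integers(0, int_high + 1, size=pop_size)
    return pop

pop = sample_mixed(mask, pop_size=5)
print(pop.shape)                          # (5, 7)
print(np.isin(pop[:, 1:], [0, 1]).all())  # True: int columns are binary
```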

I'm solving with

```python
res = minimize(problem, algorithm, termination, seed=1, save_history=True, verbose=True)
```

where `termination` is given by the pymoo function `get_termination("n_gen", 100)`.

blankjul commented 3 years ago

Yes, this is the expected behavior of an optimization run where no feasible solution could be found.

Solving an optimization problem with 4000 variables using a standard GA implementation without any customizations needs more than 100 generations (probably a lot more, depending on the problem's complexity). You can access the least infeasible solution found via `res.algorithm.opt[0]`, or set `return_least_infeasible=True` when calling the `minimize` function.
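The "least infeasible" solution mentioned here is simply the one with the smallest total constraint violation when no individual is feasible. A numpy sketch of that selection (the CV values are invented):

```python
import numpy as np

# Hypothetical constraint violations of a final population in which
# no individual is feasible (all CV > 0).
cv = np.array([3.2, 0.7, 1.5, 0.9])

# The least infeasible individual minimizes the constraint violation.
least_infeasible = int(np.argmin(cv))
print(least_infeasible)      # 1
print(cv[least_infeasible])  # 0.7
```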

You might also want to do some hyperparameter optimization, trying different combinations of `pop_size`, `n_offsprings`, `eta`, and `prob`. Plotting the convergence of the algorithm will help you compare the different configurations.
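Since the run was started with `save_history=True`, one way to compare configurations is to track the best constraint violation per generation: the run first becomes useful once that curve hits zero. A sketch with made-up per-generation data (the helper `first_feasible_generation` is hypothetical, not a pymoo API):

```python
import numpy as np

# Made-up best-CV-per-generation series for two hypothetical configs.
cv_config_a = np.array([9.0, 7.5, 7.5, 4.0, 2.2, 0.0, 0.0])
cv_config_b = np.array([9.0, 8.8, 8.1, 7.9, 7.5, 7.2, 7.0])

def first_feasible_generation(cv_per_gen):
    """Generation index where CV first reaches 0, or None if it never does."""
    hits = np.flatnonzero(cv_per_gen == 0)
    return int(hits[0]) if hits.size else None

print(first_feasible_generation(cv_config_a))  # 5
print(first_feasible_generation(cv_config_b))  # None
```

A configuration whose CV curve never reaches zero within the generation budget, like `cv_config_b` here, is the situation described in this issue: `res.X` stays `None`.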

andresliszt commented 3 years ago

Thank you! Yes, that was the problem. I thought there was always a returned solution (feasible or not). It was just a test with that parameter, and I expected the solution to be far from optimal. Thank you very much!