anyoptimization / pymoo

NSGA2, NSGA3, R-NSGA3, MOEAD, Genetic Algorithms (GA), Differential Evolution (DE), CMAES, PSO
https://pymoo.org
Apache License 2.0

Pymoo Dynamic Multiobjective Optimization #289

Closed: bandundu closed this issue 2 years ago

bandundu commented 2 years ago

Hello my optimization fellows,

For my thesis, I am currently implementing the CEC2018 DF benchmark suite in pymoo.

So far I have defined the following problem:

from pymoo.core.problem import Problem
import numpy as np
from numpy import sin, pi, floor, power

class DF1(Problem):
    def __init__(self, taut=30, nt=10, n_var=10):
        super().__init__(n_var=n_var, n_obj=2, n_constr=0, xl=0.0, xu=1.0)
        self.tau = 0 # generation counter
        self.taut = taut # frequency of change
        self.nt = nt # severity of change
        self.T0 = 50  # first change occurs after T0 generations
        self.t = 0 # time

    def _update_t(self):
        tau_tmp = max(self.tau + self.taut - (self.T0 + 1), 0)
        self.t = (1.0 / self.nt) * floor(tau_tmp / self.taut)

    def _calc_pareto_front(self, n_pareto_points=100):

        # sample the analytical Pareto front of DF1 at the current time:
        # f2 = (1 - f1)^H, obtained by setting g = 1 in the objectives
        x = np.linspace(0, 1, n_pareto_points)
        return np.column_stack([x, (1 - x) ** self._H()])

    def _g(self):
        G = abs(sin(0.5 * pi * self.t))
        return G

    def _H(self):
        H = 0.75 * sin(0.5 * pi * self.t) + 1.25
        return H

    def _evaluate(self, x, out, *args, **kwargs):

        # update time instant
        self._update_t()

        G = self._g()
        H = self._H()

        F1 = []
        F2 = []

        # evaluate fitness for each individual and append to result list
        for individual in x:
            g = 1 + sum((individual[1:] - G) ** 2)
            f1 = individual[0]
            f2 = g * power(1 - (individual[0] / g), H)

            F1.append(f1)
            F2.append(f2)

        # increase the generation counter (incremented once per call to
        # evaluate, i.e. once per generation in the loop below)
        self.tau += 1

        out["F"] = np.column_stack([F1, F2])

Now, one can test this problem with the following code. I am using the ask-and-tell interface because I also want to implement dynamic multi-objective optimization algorithms that relocate the population after a change in the environment occurs.

from pymoo.algorithms.moo.nsga2 import NSGA2
from benchmarks import DF1

problem = DF1()

algorithm = NSGA2(pop_size=100)

# prepare the algorithm to solve the specific problem (same arguments as for the minimize function)
algorithm.setup(problem, termination=("n_gen", 500), seed=1, verbose=True)

# run the loop until the algorithm has terminated
while algorithm.has_next():

    # ask the algorithm for the next solution to be evaluated
    pop = algorithm.ask()

    # evaluate the individuals using the algorithm's evaluator (necessary to count evaluations for termination)
    algorithm.evaluator.eval(problem, pop)

    # return the evaluated individuals, which may have been modified by the evaluator
    algorithm.tell(infills=pop)

# obtain the result object from the algorithm
res = algorithm.result()
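
Since the eventual goal is to relocate the population after an environment change, below is a minimal sketch of how such a reaction could be wired into the same loop (assuming a fresh algorithm.setup as above). This is not pymoo's built-in dynamic handling; the change test on problem.t and the 20% random re-initialization are illustrative choices:

import numpy as np
from pymoo.core.population import Population

t_last = problem.t

while algorithm.has_next():
    pop = algorithm.ask()

    # problem.t was updated during the previous generation's evaluation; if it
    # advanced, replace a fraction of the offspring with random solutions so
    # the population can relocate
    if problem.t != t_last:
        t_last = problem.t
        n = len(pop) // 5  # re-initialize 20% of the individuals (arbitrary)
        X = np.random.uniform(problem.xl, problem.xu, size=(n, problem.n_var))
        pop = Population.merge(pop[:-n], Population.new("X", X))

    # keep a single eval call per generation so DF1's tau still counts generations
    algorithm.evaluator.eval(problem, pop)
    algorithm.tell(infills=pop)

This is essentially the response mechanism of D-NSGA-II (version A), which re-initializes part of the population at random after a detected change.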

Examining the output of the plain NSGA-II loop (without any change response), one can notice that the performance metrics change only every 30 generations, as the problem's fitness function changes every 30 generations (taut=30).

============================================================
n_gen |  n_eval |     igd      |      gd      |      hv     
============================================================
   48 |    4800 |  0.005307900 |  0.004815726 |  0.546348513
   49 |    4900 |  0.005361608 |  0.004736356 |  0.546309798
   50 |    5000 |  0.005008034 |  0.004418473 |  0.546871336
   51 |    5100 |  0.004842350 |  0.004153299 |  0.547019545
   52 |    5200 |  0.004842350 |  0.004153299 |  0.547019545
   53 |    5300 |  0.004842350 |  0.004153299 |  0.547019545
   54 |    5400 |  0.004842350 |  0.004153299 |  0.547019545
   55 |    5500 |  0.004842350 |  0.004153299 |  0.547019545
   56 |    5600 |  0.004842350 |  0.004153299 |  0.547019545
   57 |    5700 |  0.004842350 |  0.004153299 |  0.547019545
   58 |    5800 |  0.004842350 |  0.004153299 |  0.547019545
   59 |    5900 |  0.004842350 |  0.004153299 |  0.547019545
   60 |    6000 |  0.004842350 |  0.004153299 |  0.547019545
   61 |    6100 |  0.004842350 |  0.004153299 |  0.547019545
   62 |    6200 |  0.004842350 |  0.004153299 |  0.547019545
   63 |    6300 |  0.004842350 |  0.004153299 |  0.547019545
   64 |    6400 |  0.004842350 |  0.004153299 |  0.547019545
   65 |    6500 |  0.004842350 |  0.004153299 |  0.547019545
   66 |    6600 |  0.004842350 |  0.004153299 |  0.547019545
   67 |    6700 |  0.004842350 |  0.004153299 |  0.547019545
   68 |    6800 |  0.004842350 |  0.004153299 |  0.547019545
   69 |    6900 |  0.004842350 |  0.004153299 |  0.547019545
   70 |    7000 |  0.004842350 |  0.004153299 |  0.547019545
   71 |    7100 |  0.004842350 |  0.004153299 |  0.547019545
   72 |    7200 |  0.004842350 |  0.004153299 |  0.547019545
   73 |    7300 |  0.004842350 |  0.004153299 |  0.547019545
   74 |    7400 |  0.004842350 |  0.004153299 |  0.547019545
   75 |    7500 |  0.004842350 |  0.004153299 |  0.547019545
   76 |    7600 |  0.004842350 |  0.004153299 |  0.547019545
   77 |    7700 |  0.004842350 |  0.004153299 |  0.547019545
   78 |    7800 |  0.004846134 |  0.006071532 |  0.547014882
   79 |    7900 |  0.004846134 |  0.006071532 |  0.547014882
   80 |    8000 |  0.004846134 |  0.005925610 |  0.547014882
   81 |    8100 |  0.004846134 |  0.005925610 |  0.547014882
   82 |    8200 |  0.004846134 |  0.005925610 |  0.547014882
   83 |    8300 |  0.004846134 |  0.005925610 |  0.547014882
   84 |    8400 |  0.004846134 |  0.005925610 |  0.547014882
   85 |    8500 |  0.004846134 |  0.005925610 |  0.547014882
   86 |    8600 |  0.004846134 |  0.005925610 |  0.547014882
   87 |    8700 |  0.004846134 |  0.005925610 |  0.547014882
   88 |    8800 |  0.004846134 |  0.005925610 |  0.547014882
   89 |    8900 |  0.004846134 |  0.005925610 |  0.547014882
   90 |    9000 |  0.004846134 |  0.005925610 |  0.547014882
   91 |    9100 |  0.004846134 |  0.005925610 |  0.547014882
   92 |    9200 |  0.004846134 |  0.005925610 |  0.547014882
   93 |    9300 |  0.004846134 |  0.005925610 |  0.547014882
   94 |    9400 |  0.004846134 |  0.005925610 |  0.547014882
   95 |    9500 |  0.004846134 |  0.005925610 |  0.547014882
   96 |    9600 |  0.004846134 |  0.005925610 |  0.547014882
   97 |    9700 |  0.004846134 |  0.005925610 |  0.547014882
   98 |    9800 |  0.004846134 |  0.005925610 |  0.547014882
   99 |    9900 |  0.004846134 |  0.005925610 |  0.547014882

At this point, my question:

Shouldn't the metrics still change at least a little every generation, since after each environment change the DF1 problem favors different solutions? To me, it looks like NSGA-II does not track the new Pareto front at all.

I have also attached some visualizations illustrating my concern: visualisation.zip

Is this a bug, or am I misunderstanding something and this is simply how NSGA-II is expected to behave here?

I would be really thankful for any kind of help or clarification here, thanks!

blankjul commented 2 years ago

I have just found the time to work on the dynamic problems. For a sneak peek, see: http://develop.pymoo.org/algorithms/moo/dnsga2.html

The DF problems are available in the current development branch: https://github.com/anyoptimization/pymoo/blob/develop/pymoo/problems/dynamic/df.py

I am currently working on the 0.6.0 release, which will be available soon (hopefully).
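
For anyone who wants to try it, here is a minimal sketch of how the new modules could be used once the develop branch is installed; the import paths follow the files linked above, but the exact constructor arguments may still change before 0.6.0:

from pymoo.algorithms.moo.dnsga2 import DNSGA2
from pymoo.optimize import minimize
from pymoo.problems.dynamic.df import DF1

# taut / nt as in the discussion above; defaults in the develop branch may differ
problem = DF1(taut=30, nt=10)
algorithm = DNSGA2()

res = minimize(problem, algorithm, ("n_gen", 300), seed=1, verbose=True)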

The implementation is very similar to yours. I am searching for someone to test the implementation and perform a benchmark. Would this align with your research? Are you interested?

bandundu commented 2 years ago

Hi! This would be excellent for me and is exactly where my research is heading. I'll gladly help wherever I can; I might just need some guidance here and there.

As a first step, how would one install the latest development branch?

With this I am getting the following error (screenshot attached: Bildschirmfoto vom 2022-07-05 16-06-08).

With this I am only getting a version that does not contain the newest problems and algorithm (screenshot attached: Bildschirmfoto vom 2022-07-05 16-08-20).

Thank you very much!

bandundu commented 2 years ago

I was able to install the latest development branch with

git clone https://github.com/anyoptimization/pymoo
cd pymoo
git checkout develop
pip install .
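
Alternatively, pip can install the branch in one step using its standard VCS URL syntax (this is generic pip behavior, nothing pymoo-specific):

pip install git+https://github.com/anyoptimization/pymoo.git@develop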

blankjul commented 2 years ago

Have you found the time to check the DF problems yet? I am also more than happy to have a meeting to talk about features for dynamic optimization in general.