DEAP / deap

Distributed Evolutionary Algorithms in Python
http://deap.readthedocs.org/
GNU Lesser General Public License v3.0
5.66k stars 1.11k forks

multi-objective genetic algorithm #505

Open atefeh-mehrabi opened 3 years ago

atefeh-mehrabi commented 3 years ago

Hello,

I have been using the GA examples, like NSGA-II, to do some hyperparameter tuning. I have four parameters and five objective functions.

I use something like the code below to minimize two objective functions, f1 and f2. However, the samples are exactly identical whether I run a single-objective GA or a multi-objective one. Changing what the evalIndividual() function shown below returns does not change the samples during the search. Is there anything else that I am missing with the multi-objective GA? I would appreciate your input on this. Thanks.

creator.create("FitnessMulti", base.Fitness, weights=(-1.0, -1.0))
creator.create("Individual", list, fitness=creator.FitnessMulti)

toolbox = base.Toolbox()
toolbox.register("unroll", random.randint, 1, 9)
toolbox.register("individual", tools.initCycle, creator.Individual,
                 (toolbox.unroll, toolbox.unroll, toolbox.unroll, toolbox.unroll), n=1)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

def evalIndividual(individual):
    .....
    return f1, f2

toolbox.register("evaluate", evalIndividual)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutFlipBit, indpb=0.05)
toolbox.register("select", tools.selTournament, tournsize=3)

bjuergens commented 3 years ago

If I understand this correctly, the default fitness comparison in DEAP is lexicographical (here it says so in the source code), which means it only considers the value of the second element when the first elements are identical. So if your first objective is continuous, then all your other objectives will be ignored most of the time. (Please correct me if I am wrong.)
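To make the lexicographic part concrete, this is just plain Python tuple comparison, which as far as I can tell is what the wvalues comparison comes down to (the numbers here are made up):

# Tuple comparison is lexicographic: the second value only matters when the first values tie.
a = (0.5, 100.0)
b = (0.6, 0.0)
print(a < b)                      # True, decided entirely by 0.5 < 0.6; 100.0 vs 0.0 is ignored
print((0.5, 1.0) < (0.5, 2.0))    # True, the second value only breaks the exact tie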

In my personal opinion it would probably be better not to use Python's tuple comparison when comparing fitnesses, and instead compare the sums of each tuple.

A little bit off topic: for my own multi-objective work, I replace the fitness with a rank transformation of my objectives. First I calculate the rank for each single objective. Then I add the individual ranks together and use the resulting value as the fitness. (I also apply different weights to the objectives by multiplying each objective by the weight I have selected for it.)
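A rough sketch of that rank-sum idea, outside of DEAP; the function name, the data layout and the larger-is-better convention are my own choices here:

def rank_sum_fitness(objective_values, weights):
    # objective_values: one tuple of raw objective values per individual (larger is better here)
    # weights: one weight per objective, applied to the per-objective ranks
    n_ind = len(objective_values)
    totals = [0.0] * n_ind
    for j, w in enumerate(weights):
        # rank individuals on objective j: 0 for the worst value, n_ind - 1 for the best
        order = sorted(range(n_ind), key=lambda i: objective_values[i][j])
        for rank, i in enumerate(order):
            totals[i] += w * rank
    return totals  # one scalar per individual, usable as a single-objective fitness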

atefeh-mehrabi commented 3 years ago

@bjuergens Thanks for your response.

My objective functions are such that I cannot combine them all into a single meaningful function. I changed the selection to NSGA-II, and it did not make a big difference either; the samples are almost equal across the different objectives. Is there any way I could use what exists in the toolbox to register multiple objective functions separately so that they get optimized simultaneously?

Also, would you mind giving me a pointer to the code showing how you implement your suggestion, or to an example that does that?

Thanks!

bjuergens commented 3 years ago

I am just guessing here, but I think the preferred way would be to create a subclass of Fitness and override the comparator methods with more meaningful ones.

So something like

from deap import base

class MyFitness(base.Fitness):
    # weights still has to be set on the subclass (e.g. via creator.create)
    # before instances can be created.

    def _wsum(self):
        # sum of the weighted objective values
        return sum(self.wvalues)

    def __le__(self, other):
        return self._wsum() <= other._wsum()

    def __lt__(self, other):
        return self._wsum() < other._wsum()

    def __eq__(self, other):
        return self._wsum() == other._wsum()

    # overriding __eq__ would otherwise disable hashing on the subclass
    __hash__ = base.Fitness.__hash__

Edit: I didn't notice you said you work with NSGA-II. Disregard everything I said. My best guess is that the dominates() method from the Fitness class could be useful.
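For example, something like this, assuming a two-objective minimization fitness as in the question (the values are made up):

from deap import base, creator

creator.create("FitnessMulti", base.Fitness, weights=(-1.0, -1.0))

f_a = creator.FitnessMulti((1.0, 2.0))
f_b = creator.FitnessMulti((2.0, 3.0))
print(f_a.dominates(f_b))  # True: f_a is no worse on both objectives and strictly better on at least one
print(f_b.dominates(f_a))  # False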

NewJerseyStyle commented 3 years ago

I was reading the docs and found this, which may be useful:

https://deap.readthedocs.io/en/master/tutorials/basic/part1.html#fitness

creator.create("FitnessMulti", base.Fitness, weights=(-1.0, 1.0))

This code produces a fitness that minimizes the first objective and maximizes the second one. The weights can also be used to vary the importance of each objective against the others. This means that the weights can be any real number, and only the sign is used to determine whether a maximization or minimization is done. An example of where the weights can be useful is in the crowding distance sort made in the NSGA-II selection algorithm.
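A quick way to check what those weights actually do (toy values of my own):

from deap import base, creator

creator.create("FitnessMulti", base.Fitness, weights=(-1.0, 1.0))
fit = creator.FitnessMulti((3.0, 5.0))  # raw values: first objective is minimized, second maximized
print(fit.values)   # (3.0, 5.0)
print(fit.wvalues)  # (-3.0, 5.0): DEAP compares the weighted values, always as a maximization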

atefeh-mehrabi commented 3 years ago

@bjuergens Thanks so much for sharing this. I used something similar to NSGA-II and chose to minimize all objectives by using -1.0 as the weight. It seems to work better. Thank you.
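A minimal, self-contained sketch of that kind of setup (all weights -1.0, selNSGA2 as the selection operator); the placeholder evalIndividual and the switch from mutFlipBit to mutUniformInt for the integer genes are assumptions, not something confirmed in the thread:

import random
from deap import base, creator, tools

creator.create("FitnessMulti", base.Fitness, weights=(-1.0, -1.0))
creator.create("Individual", list, fitness=creator.FitnessMulti)

def evalIndividual(individual):
    # placeholder objectives; replace with the real f1, f2 from the question
    return sum(individual), max(individual)

toolbox = base.Toolbox()
toolbox.register("unroll", random.randint, 1, 9)
toolbox.register("individual", tools.initCycle, creator.Individual,
                 (toolbox.unroll,) * 4, n=1)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evalIndividual)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutUniformInt, low=1, up=9, indpb=0.05)  # integer genes
toolbox.register("select", tools.selNSGA2)

pop = toolbox.population(n=20)
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)
survivors = toolbox.select(pop, k=10)  # NSGA-II: non-dominated sorting + crowding distance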