NiaOrg / NiaPy

Python microframework for building nature-inspired algorithms. Official docs: https://niapy.org

jSO algorithm #379

Open cola9 opened 2 years ago

cola9 commented 2 years ago

Implementing the jSO algorithm in the existing library.

firefly-cpp commented 2 years ago

@cola9 is working on the implementation of the jSO algorithm proposed by Brest et al.

Reference: Brest, Janez, Mirjam Sepesy Maučec, and Borko Bošković. "Single objective real-parameter optimization: Algorithm jSO." 2017 IEEE congress on evolutionary computation (CEC). IEEE, 2017.

cola9 commented 2 years ago

I'm currently reviewing your NiaPy library to understand how I need to implement the jSO algorithm.

I took the DE algorithm as an example and looked at the whole implementation of the "DifferentialEvolution" (DE) class, which is clear to me.

I also looked at an example of using the DE algorithm, and there are a couple of things that are not clear to me which I would like to ask you about.

In line 14, algo = DifferentialEvolution(population_size=50, differential_weight=0.5, crossover_probability=0.9), I see that you initialize the algorithm with the necessary parameters. A Task is then created on line 16 and used on line 17 when calling algo.run(task). What I don't understand is how the run method (from line 17) executes the DE algorithm and all the necessary functions ("evolve", "selection", "run_iteration", ...)?
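
For reference, the quoted lines of that example look roughly like this when put together (a reconstruction; Sphere stands in for whatever problem the example actually optimizes):

from niapy.algorithms.basic import DifferentialEvolution
from niapy.problems import Sphere
from niapy.task import Task

# Line 14: configure the algorithm with its control parameters.
algo = DifferentialEvolution(population_size=50, differential_weight=0.5, crossover_probability=0.9)
# Line 16: define the optimization task (problem plus stopping condition).
task = Task(problem=Sphere(dimension=10), max_evals=10000)
# Line 17: run the algorithm on the task.
best_x, best_fitness = algo.run(task)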

zStupan commented 2 years ago

run_iteration is the body of the algorithm's main loop. Internally, the run method looks something like this:

def run(self, task):
    population, fitness, params = self.init_population(task)
    best_individual, best_fitness = self.get_best(population, fitness)
    while not task.stopping_condition():
        population, fitness, best_individual, best_fitness, params = self.run_iteration(task, population, fitness, best_individual, best_fitness, params)
        task.next_iter()
    return best_individual, best_fitness

The evolve, selection and post_selection methods of DifferentialEvolution are called in run_iteration:

https://github.com/NiaOrg/NiaPy/blob/dbf2e3f05c2fff8f273f5952b9fb21aac5541e37/niapy/algorithms/basic/de.py#L392-L422

DifferentialEvolution and related algorithms differ from the rest of the algorithms in the framework in that they use the Individual class to represent an individual, instead of a numpy array. The Individual class has two attributes: x, a numpy array representing the solution vector, and f, the individual's fitness. It also has the method evaluate(task), which puts the solution vector x within the bounds of the problem and calculates the new fitness value. You can extend the Individual class to add extra parameters to it, as in the sketch below.
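
For example, a subclass carrying per-individual control parameters might look like this (the SolutionjSO name and its defaults are illustrative, not part of the framework):

from niapy.algorithms import Individual

class SolutionjSO(Individual):
    def __init__(self, differential_weight=0.5, crossover_probability=0.9, **kwargs):
        # x and f are handled by the base class; the extra attributes
        # simply travel with the individual.
        super().__init__(**kwargs)
        self.differential_weight = differential_weight
        self.crossover_probability = crossover_probability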

For other parameters that change during the runtime of the algorithm (that are not connected to the individual), you can add them to the params dict by overriding the algorithm's init_population method like so:

def init_population(self, task):
    population, fitness, params = super().init_population(task)
    params.update({'param1': 12, 'param2': 34})
    return population, fitness, params

You can then access and modify those parameters in the run_iteration method like so:

def run_iteration(self, task, population, fitness, best_individual, best_fitness, params):
    param1 = params.pop('param1')
    param2 = params.pop('param2')
    # ...
    param1 = 8
    param2 = 42
    return population, fitness, best_individual, best_fitness, {'param1': param1, 'param2': param2}
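
Note that because run_iteration returns the updated params dict, any state you put into it carries over to the next iteration automatically, so there is no need to store it in instance attributes.
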
cola9 commented 2 years ago

Thank you very much for the very detailed explanation. The process is now much clearer. So if I understand correctly, the run method (shown below) is already written (somewhere?) and cannot be changed?

def run(self, task):
    population, fitness, params = self.init_population(task)
    best_individual, best_fitness = self.get_best(population, fitness)
    while not task.stopping_condition():
        population, fitness, best_individual, best_fitness, params = self.run_iteration(task, population, fitness, best_individual, best_fitness, params)
        task.next_iter()
    return best_individual, best_fitness

I am asking this because I saw that task.stopping_condition() has predefined conditions:

    return (self.evals >= self.max_evals) or (self.iters >= self.max_iters) or (self.cutoff_value * self.optimization_type.value >= self.x_f * self.optimization_type.value)

Maybe these conditions won't suit my jSO algorithm.

zStupan commented 2 years ago

No problem. Yes, the run method is implemented in the Algorithm base class. There is no need to change it. You only need to override run_iteration, and init_population if there are runtime parameters that vary.

The stopping condition can be max_iters, max_evals, or cutoff_value. max_iters controls the maximum number of iterations (generations) of the algorithm, while max_evals is the maximum number of fitness function evaluations. cutoff_value means the algorithm will run until a fitness value <= cutoff_value is found.
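
To make that concrete, the three variants could be set up like this (a minimal sketch; Sphere is just a stand-in problem):

from niapy.problems import Sphere
from niapy.task import Task

# Stop after 1000 generations:
task = Task(problem=Sphere(dimension=10), max_iters=1000)
# Stop after 10000 fitness evaluations:
task = Task(problem=Sphere(dimension=10), max_evals=10000)
# Stop once a fitness <= 1e-8 is found (usually combined with a budget):
task = Task(problem=Sphere(dimension=10), max_evals=10000, cutoff_value=1e-8)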

I see that jSO has some equations that use max_nfes, which corresponds to max_evals in our framework. You can access it in run_iteration as task.max_evals, and you can get the current number of function evaluations with task.evals (both are attributes of the task). If the stopping condition is max_iters, I guess you could compute max_evals from max_iters and population_size.
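
For example, jSO's linear population size reduction is driven by exactly these counters; a hedged sketch of that computation (the formula follows Brest et al. 2017; the function name and min_size default are illustrative):

def next_population_size(task, initial_size, min_size=4):
    # Shrink the population linearly from initial_size towards min_size
    # as the evaluation budget is used up (nfes / max_nfes in the paper).
    progress = task.evals / task.max_evals
    return round((min_size - initial_size) * progress + initial_size)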

cola9 commented 2 years ago

Thank you very much. I think that I now understand everything and can start with the implementation of the jSO algorithm.

firefly-cpp commented 2 years ago

Hi @cola9!

How is the project coming along?