esa / pygmo2

A Python platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
https://esa.github.io/pygmo2/
Mozilla Public License 2.0

How to specify an initial starting point in Pygmo #65

Closed chuongask closed 3 years ago

chuongask commented 3 years ago

Dear all, I am using Pygmo to solve both single- and multi-objective optimization problems. I have indicated an initial starting point for each method, but it did not work when running the code. Could you take a look at the following modified code to see whether I have set the initial starting point correctly? Thank you very much for your kind help.

1) Code for the single-objective optimization problem:

# Define pygmo class
class alge_function:
    def __init__(self):
        self.starting = np.random.randint(1, 5, 16)

    def fitness(self, x):
        x = np.array(x, dtype=int)  # to make integers
        x = list(x)
        vec = look_up_function(x)
        obj = scalar_function(w, vec)
        return [obj]

    # number of objectives
    #def get_nobj(self):
    #    return 18

    # Integer Dimension
    def get_nix(self):
        return 16

    # bounds [1, 4]
    def get_bounds(self):
        return ([1] * 16, [4] * 16)

    # Gradient of the fitness (used by the slsqp local algorithm)
    def gradient(self, x):
        return pg.estimate_gradient_h(lambda x: self.fitness(x), x)

# use "slsqp" - sequential least squares programming algorithm
start = time.time()
algo = pg.algorithm(uda=pg.mbh(pg.nlopt("slsqp"), stop=20, perturb=.2))
algo.set_verbosity(1)  # print dialog

# Formulate the minimization problem
pop = pg.population(prob=alge_function(), size=200)

# Solve the problem
pop = algo.evolve(pop)

2) Code for the multi-objective optimization problem:

# Define pygmo class
class alge_function:
    def __init__(self):
        self.starting = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
                         1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]

    def fitness(self, x):
        x = np.array(x, dtype=int)  # to make integers
        x = list(x)
        objvector = look_up_function(x)
        objv1 = reduce_function1(objvector)  # 13 objectives
        objv2 = reduce_function2(objv1)      # 8 objectives
        fs = reduce_function(objv2)          # reduce to 4 objectives
        return fs

    # number of objectives
    def get_nobj(self):
        return 4

    # Integer Dimension
    def get_nix(self):
        return 16

    # bounds [1, 4]
    def get_bounds(self):
        return ([1] * 16, [4] * 16)

# Formulate the problem and take populations (budgets)
pro = pg.problem(alge_function())

# create a random initial population
pop = pg.population(pro, size=200)

# Method: (nspso) Non-dominated Sorting Particle Swarm Optimization
algo = pg.algorithm(pg.nspso(gen=1000))  # gen: number of generations
algo.set_verbosity(100)  # print dialog

# solve the problem
pop = algo.evolve(pop)
fits, vectors = pop.get_f(), pop.get_x()
print(pro)

nirmalsnair commented 3 years ago

For an individual, the initial solution is set as follows:

prob = pg.problem(Search())
pop = pg.population(prob=prob, size=1)
pop.set_x(0, init_val)
pop = algo.evolve(pop)

See the documentation for more details.
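For reference, here is a minimal, self-contained sketch of that pattern. The sphere_udp toy problem, the single-individual population and the compass_search local solver are illustrative assumptions, not taken from this thread; any user-defined problem and algorithm work the same way:

import pygmo as pg

# toy three-variable problem; stands in for the Search() problem in the snippet above
class sphere_udp:
    def fitness(self, x):
        return [sum(xi * xi for xi in x)]

    def get_bounds(self):
        return ([-1.0] * 3, [1.0] * 3)

prob = pg.problem(sphere_udp())
pop = pg.population(prob=prob, size=1)    # a single individual, as in the snippet above

init_val = [0.1, 0.4, 0.3]                # starting point, same length as the decision vector
pop.set_x(0, init_val)                    # overwrite individual 0 with the chosen start

algo = pg.algorithm(pg.compass_search(max_fevals=500))
pop = algo.evolve(pop)
print(pop.champion_x, pop.champion_f)

Note that set_x also triggers one fitness evaluation for the modified individual, so the seeded starting point is already reflected in the population before evolve is called.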

bluescarni commented 3 years ago

@chuongask as @nsn88 mentions, please make sure to read the tutorials in the documentation. If something is unclear, feel free to come over to the gitter channel to ask general questions:

https://gitter.im/pagmo2/Lobby

We are aware that the introductory part of the documentation needs to be improved; it is on our to-do list.

chuongask commented 3 years ago

Thank you for your reply.

pop.set_x(x0, init_val) is for a 1-dimensional variable. How can I set x0 when it is a given vector, e.g. x0 = (1, 2, 3)?

nirmalsnair commented 3 years ago

Sorry, you are mistaken.

Here init_val is the decision vector, which has the same length as the problem dimension.

For example, if your problem has three variables and the lower and upper bounds are [0.0, 0.0, 0.0] and [1.0, 1.0, 1.0] respectively, you can initialize your first individual as:

init_val = [0.1, 0.4, 0.3]
pop.set_x(0, init_val)

init_val can be a list or a NumPy array.
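Applied to the 16-dimensional integer problem from the original post, the seeding step would look roughly like this (a sketch that assumes the alge_function class and the look-up functions it calls are defined as above; the all-twos starting vector is only an illustrative value inside the [1, 4] bounds):

import numpy as np
import pygmo as pg

pro = pg.problem(alge_function())      # user-defined problem from the original post
pop = pg.population(pro, size=200)     # 200 random individuals

x0 = np.full(16, 2.0)                  # length-16 starting vector inside the bounds [1, 4]
pop.set_x(0, x0)                       # individual 0 now starts from x0; the rest stay random

# then evolve as before, e.g. pop = algo.evolve(pop)

Only the decision vector of individual 0 is overwritten; the remaining individuals keep their random initial values, which is usually what you want when only one good guess is available.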

chuongask commented 3 years ago

I see! Thank you very much!