DEAP / deap

Distributed Evolutionary Algorithms in Python
http://deap.readthedocs.org/
GNU Lesser General Public License v3.0

When using multiprocessing, something goes wrong. #355

Closed: dotcom closed this issue 5 years ago

dotcom commented 5 years ago

In the evaluation function, I print the fitness value and the individual being evaluated.

# Imports and constants restored for completeness; the constant values below
# are placeholders, not given in the original report.
import multiprocessing
import random
import time

import numpy as np
from deap import algorithms, base, creator, tools

DATASIZE = 1000
MATE_RATE, MUTATE_RATE, GEN = 0.5, 0.2, 10
MATE_GENOM_RATE, MUTATE_GENOM_RATE = 0.5, 0.2
MUTATE_MU, MUTATE_SIGMA = 0.0, 0.3
TOURNAMENT = 3

def func(x, y, z, w):
    return x*y + z + 4*w

train = []
for i in range(DATASIZE):
    X = []
    for j in range(4):
        X.append(random.uniform(10,100))
    X.append(random.choice([func(X[0],X[1],X[2],X[3]),0,0,0,0]))
    train.append(X)

def step_function(x):
    if x > 0:
        return 1
    else:
        return 0

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def mapnp(f,x):
    return np.array(list(map(f,x)))

def paramfunc(vec, mat1, mat2, mat3):
    # Small feed-forward net: two sigmoid layers followed by a step output.
    hidden1 = mapnp(sigmoid, np.dot(vec, mat1))
    hidden2 = mapnp(sigmoid, np.dot(hidden1, mat2))
    return mapnp(step_function, np.dot(hidden2, mat3))

def listparamfunc(vec, paramlist):
    mat1 = [paramlist[0:4], paramlist[4:8], paramlist[8:12], paramlist[12:16]]
    mat2 = [paramlist[16:20], paramlist[20:24], paramlist[24:28], paramlist[28:32]]
    mat3 = paramlist[32:]
    return paramfunc(np.array(vec), np.array(mat1), np.array(mat2), np.array(mat3))

def evalind(individual):
    # Fitness depends on the module-level `train` data.
    profit = 0
    nptrain = np.array(train)
    result = listparamfunc(nptrain[:, 0:4], individual)
    for x in range(len(train)):
        if result[x] == 1:
            profit = profit + train[x][4] - 1000
    ################################################
    print(profit, individual) ############ output ############
    ################################################
    return profit,

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_float", random.uniform, -1, 1)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_float, 36)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

toolbox.register("evaluate", evalind)
toolbox.register("mate", tools.cxUniform, indpb=MATE_GENOM_RATE)
toolbox.register("mutate", tools.mutGaussian, indpb=MUTATE_GENOM_RATE, mu=MUTATE_MU, sigma=MUTATE_SIGMA)
toolbox.register("select", tools.selTournament, tournsize=TOURNAMENT)

def main(pop, gen=GEN):
    random.seed(time.time())
    if pop is None:
        pop = toolbox.population(n=500)
    hof = tools.HallOfFame(1)

    pop, log = algorithms.eaSimple(pop, toolbox, MATE_RATE, MUTATE_RATE, gen, halloffame=hof, verbose=True)
    return pop, log, hof

if __name__ == "__main__":

    pool = multiprocessing.Pool(11)
    toolbox.register("map", pool.map)

    pop, log, hof = main(pop=None, gen=10)

    best = tools.selBest(pop, 1)[0]
    print("best: ", best)
    print("best.fitness.values : ", best.fitness.values)
    print("evalind(best) : ", evalind(best))

This was the output with the highest fitness:

531564.2498946949 : [-0.014781777922400874, -0.3787021885147608, ....

However, when I evaluated the best individual (hof) again after completion, a different value came out. (The genome is the same.)

best:  [-0.014781777922400874, -0.3787021885147608, ....
best.fitness.values :  (531564.2498946949,)

evalind(best) :  (265238.98228751967,)  #     !=531564.2498946949

Turning off multiprocessing

After some investigation, it seems that multiprocessing is the cause. When I disabled multiprocessing, the values matched as expected:

#pool = multiprocessing.Pool(11)
#toolbox.register("map", pool.map)
best.fitness.values :  (511370.0290722571,)
evalind(best) :  (511370.0290722571,)

Why? Thank you.

fmder commented 5 years ago

Your evaluation function depends on a global variable train that is randomly generated. This cannot work in multiprocessing as each process will have a different train object.
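A minimal sketch of the usual workaround, not from the thread: either seed the RNG before building train so every process regenerates identical data, or build train once in the main process and hand it to the workers through a Pool initializer (names like init_worker and the data shape here are illustrative).

import multiprocessing
import random

# Sketch only: two ways to give every worker the same `train` data.

# Option 1: fix the seed before the module-level generation, so each
# spawned process rebuilds identical values.
random.seed(12345)  # arbitrary fixed seed
train = [[random.uniform(10, 100) for _ in range(4)] for _ in range(100)]

# Option 2: build `train` once in the main process and push it into each
# worker via a Pool initializer, overwriting the worker's own copy.
def init_worker(shared_train):
    global train
    train = shared_train

if __name__ == "__main__":
    pool = multiprocessing.Pool(4, initializer=init_worker, initargs=(train,))
    # toolbox.register("map", pool.map)  # then register as in the original script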

dotcom commented 5 years ago

They are different objects, but are the values actually different?

dotcom commented 5 years ago

It is randomly generated only once.

dotcom commented 5 years ago

It really is...

PID and data:

17576  :  [[56.37660726530297, 57.21448899086297, ...
14992  :  [[10.732069581858903, 27.533883587933413,...

fmder commented 5 years ago

Are you on Windows or Unix? On Windows it would be generated again in each process.

dotcom commented 5 years ago

It is Windows. When I tried it on Ubuntu just now, there was no problem. Wow, why? What kind of mechanism is this? If you know, please explain briefly. Is there no copy-on-write on Windows?

fmder commented 5 years ago

Unix forks the process and copies the memory, so the children inherit the parent's data. Windows starts k fresh processes, so the script is run in its entirety in each process.

You can find more information in the multiprocessing module documentation.
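A small standalone sketch (not from the thread) illustrates this: under spawn, the Windows default, the module-level random value is regenerated in every worker, while under fork on Unix the workers inherit the parent's value.

import multiprocessing
import os
import random

# Module-level random value: regenerated in every worker under spawn
# (Windows), inherited from the parent under fork (Unix).
DATA = random.random()

def show(_):
    return os.getpid(), DATA

if __name__ == "__main__":
    with multiprocessing.Pool(2) as pool:
        for pid, value in pool.map(show, range(4)):
            print("worker", pid, value)
    print("parent", os.getpid(), DATA)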

dotcom commented 5 years ago

Thank you for your kindness.