anyoptimization / pymoo

NSGA2, NSGA3, R-NSGA3, MOEAD, Genetic Algorithms (GA), Differential Evolution (DE), CMAES, PSO
https://pymoo.org
Apache License 2.0

Mixed-Variable Multi-Objective MixedVariableGA no pareto front #464

Closed: corsair20141 closed this 11 months ago

corsair20141 commented 1 year ago

When using the MixedVariableGA algorithm, no Pareto front is created. Additionally, res.X and res.F contain only one entry (i.e. the Pareto set is not being saved; there should be a population of 50).

Code is below. From what I can tell it is a similar setup to the MixedVariableGA sample problem. Removing the constraints does not fix the issue.

# Imports assumed by the snippet below (not shown in the original post)
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

from pymoo.core.problem import ElementwiseProblem
from pymoo.core.variable import Real, Integer, Choice
from pymoo.core.mixed import MixedVariableGA
from pymoo.core.population import Population
from pymoo.algorithms.moo.nsga2 import RankAndCrowdingSurvival
from pymoo.algorithms.moo.unsga3 import UNSGA3
from pymoo.algorithms.moo.ctaea import CTAEA
from pymoo.util.ref_dirs import get_reference_directions
from pymoo.termination import get_termination
from pymoo.optimize import minimize

# ---- User Optimum Selection Controls
weights = np.array([0.5, 0.5])  # weights for pseudo-weighted optimum selection, [min(mass), min(deflection)]

# ---- User Algorithm Controls
algorithm_selection = "MixedVariableGA"
n_gen = 100    # number of generations to run
pop_size = 50  # population size (value implied by the issue text; not defined in the original post)

class Ibeam(ElementwiseProblem):
    def __init__(self,**kwargs):      
        vars = {
            "BeamHeight" : Integer(bounds=(10, 80)),  # integer variable, so integer bounds
            "FlangeWidth": Real(bounds=(10.,60.)),
            "WebThickness": Real(bounds=(1.,5.)),
            "FlangeThickness": Real(bounds=(1.,5.)),
            "Material": Choice(options=["al_7075","ss_303"])
            }
        super().__init__(vars=vars, n_obj = 2, n_ieq_constr=3, **kwargs)

    def _evaluate(self, x, out, *args, **kwargs):
        #print("f_eval")

        # User Inputs
        Length = 4 # m
        Fx = 6.25e3 # N
        Fy = 75e3 # N

        # Unpack variables for easier understanding
        BeamHeight = x["BeamHeight"]
        FlangeWidth = x["FlangeWidth"]
        WebThickness = x["WebThickness"]
        FlangeThickness = x["FlangeThickness"]
        Material = x["Material"]
        #print(f"X = {x}")

        if Material == "al_7075":
            Modulus = 71.7e9 # Pa
            Density = 2823 # kg/m^3
            stress_lim = 503e6 #Pa, yield strength (limit)
        elif Material == "ss_303":
            Modulus = 193e9 # Pa
            Density = 8000 # kg/m^3
            stress_lim = 415e6 #Pa, yield strength (limit)

        # Calculations
        hw = BeamHeight-2*FlangeThickness
        ix = WebThickness*(hw**3)/12 +2*(FlangeWidth*(FlangeThickness**3)/12+FlangeWidth*FlangeThickness*((BeamHeight-FlangeThickness)/2)**2)
        iy = hw*(WebThickness**3)/12+2*FlangeThickness*(FlangeWidth**3)/12
        dx = Fx*(Length**2)*(3*Length-Length)/(6*Modulus*iy)
        sfx = Fx*Length*(FlangeWidth/2)/iy
        dy = Fy*(Length**2)*(3*Length-Length)/(6*Modulus*ix)
        sfy = Fy*Length*(BeamHeight/2)/ix
        area = hw*WebThickness+2*FlangeThickness*FlangeWidth

        # Objective Functions (minimize)
        mass = area*Length*Density

        target_deflection = .014/12 # in
        deflection = abs(dy - target_deflection)

        # Constraints
        stress = abs(sfx)+abs(sfy) - stress_lim # constraint: combined stress <= yield strength
        # web thickness / flange thickness lower and upper limits
        web_flange_thick_ratio_min = 0.5 - WebThickness/FlangeThickness # x[2] = WebThickness
        web_flange_thick_ratio_max = (WebThickness/FlangeThickness) - 2.0 # x[3] = FlangeThickness

        # Construct Outputs
        out["F"] = [mass,deflection]
        out["G"] = [stress,web_flange_thick_ratio_min,web_flange_thick_ratio_max]

problem = Ibeam()

# Initialize plot settings
sns.set(rc = {"figure.dpi":300, 'savefig.dpi':300, 'figure.figsize':(7,7)})
sns.set_theme()
sns.set_style("whitegrid",{'axes.linewidth': 2, 'axes.edgecolor':'black'})
#colors = ['red','blue','green','orange']
cmap = plt.cm.get_cmap('tab10')
colors = [cmap(i) for i in range(cmap.N)]

# Run Optimization(s)
if not isinstance(algorithm_selection, list):
    algorithm_selection = [algorithm_selection]

X_collector = []; F_collector = []
for i in range(len(algorithm_selection)):
    color1 = colors[i]
    # ---- Setup Solver Algorithm
    # if algorithm_selection[i] == "NSGA2":
    #     algorithm = NSGA2(
    #         pop_size=pop_size,
    #         n_offsprings=n_offsprings,
    #         sampling=sampling,
    #         crossover=crossover,
    #         mutation=mutation,
    #         eliminate_duplicates=eliminate_duplicates
    #     )

    if algorithm_selection[i] == "MixedVariableGA":
        algorithm = MixedVariableGA(
            pop_size=pop_size,
            survival=RankAndCrowdingSurvival()
        )

    if algorithm_selection[i] == "UNSGA3":
        ref_dirs = np.ones([1,problem.n_obj])
        algorithm = UNSGA3(
            ref_dirs,
            pop_size=pop_size,
            n_offsprings=n_offsprings,
            sampling=sampling,
            crossover=crossover,
            mutation=mutation,
            eliminate_duplicates=eliminate_duplicates
        )

    if algorithm_selection[i] == "CTAEA":
        ref_dirs = get_reference_directions("das-dennis", 2, n_partitions=pop_size-1)
        algorithm = CTAEA(ref_dirs=ref_dirs)

    # ---- Define Termination Criteria
    termination = get_termination("n_gen", n_gen)

    # ---- Minimize the PROBLEM using the ALGORITHM and TERMINATION CRITERIA
    res = minimize(problem,
                   algorithm,
                   termination,
                   seed=1,
                   save_history=True,
                   verbose=True)

    all_pop = Population()
    for algo in res.history:  # note: avoid shadowing the `algorithm` variable above
        all_pop = Population.merge(all_pop, algo.off)
    allpop_tmp = [list(d.values()) for d in all_pop.get("X")]  # extract dict values into a list of lists (a NumPy array can't hold mixed float and str without coercing to a common dtype)
    df_history = pd.DataFrame(allpop_tmp, columns=[f"X{i+1}" for i in range(problem.n_var)])
    df_history["F1"] = all_pop.get("F")[:,0]
    df_history["F2"] = all_pop.get("F")[:,1]
    #df_history["Web_Flange_thick_Ratio"] = all_pop.get("X")[:,2]/all_pop.get("X")[:,3]
    df_history["Web_Flange_thick_Ratio"] = df_history["X3"]/df_history["X4"]

    X = res.X
    F = res.F

    X_collector.append(X)
    F_collector.append(F)
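The `weights` array defined at the top appears to be intended for picking a single compromise point off the Pareto front. The idea can be sketched in plain NumPy (a minimal sketch of the pseudo-weights concept, not pymoo's `PseudoWeights` implementation; the function name is mine):

```python
import numpy as np

def pseudo_weight_select(F, weights):
    # F: (n_points, n_obj) objective values on the front (minimization).
    # Normalized achievement per objective: 1 at the best value, 0 at the worst.
    fmin, fmax = F.min(axis=0), F.max(axis=0)
    norm = (fmax - F) / (fmax - fmin)
    # Pseudo-weights: each row normalized to sum to 1.
    pw = norm / norm.sum(axis=1, keepdims=True)
    # Return the index of the point whose pseudo-weights best match the request.
    return int(np.argmin(np.abs(pw - weights).sum(axis=1)))

F = np.array([[1.0, 4.0], [2.0, 3.0], [4.0, 1.0]])
idx = pseudo_weight_select(F, np.array([0.5, 0.5]))
print(idx)  # -> 1, the balanced middle point
```

With `weights = [0.5, 0.5]` this favors the solution that trades mass and deflection evenly.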
jacktang commented 1 year ago

@corsair20141 Try increasing n_gen in the termination from 100 to 1000; you will get 4 solutions (they are very close).

corsair20141 commented 1 year ago

Why am I only getting 1 in the first place, though? Is there really only 1 non-dominated solution? Typically, with a homogeneous-variable NSGA-II, you get a Pareto front that is the size of your population (50).

jacktang commented 1 year ago

Hello @corsair20141, optimization problems involve a lot of domain knowledge. I am not a physicist and don't understand your beam problem deeply, so I can't be sure there is only one non-dominated solution for your problem. The final solution:

[{'BeamHeight': 10, 'FlangeWidth': 10.0, 'WebThickness': 1.0000000000000002, 'FlangeThickness': 1.0, 'Material': 'al_7075'}
 {'BeamHeight': 10, 'FlangeWidth': 10.0, 'WebThickness': 1.0, 'FlangeThickness': 1.0, 'Material': 'al_7075'}]

If you doubt it, re-check the constraints and objective functions.

Typically, with a homogeneous-variable NSGA-II, you get a Pareto front that is the size of your population (50)

You're right. If the number of non-dominated solutions is larger than 50, pymoo only returns 50.
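For intuition about what ends up in the returned set, non-dominated filtering can be sketched in plain NumPy (a toy O(n²) illustration of the concept, not pymoo's implementation):

```python
import numpy as np

def non_dominated(F):
    # F: (n_points, n_obj) array of objective values to minimize.
    # Point i is dominated if some j is <= in every objective and
    # strictly < in at least one.
    n = len(F)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                keep[i] = False
                break
    return keep

F = np.array([[1.0, 4.0], [2.0, 3.0], [3.0, 3.0], [4.0, 1.0]])
print(non_dominated(F))  # [ True  True False  True]
```

If the constraints and objectives funnel every survivor to the same corner of the design space, the filtered set can legitimately shrink to one point.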

corsair20141 commented 1 year ago

Here is a new problem that has shown up. It seems that the "Choice" variable type does not work properly on strings: when the options are "Material": Choice(options=["al_7075","ss_303"]), sometimes the chosen value becomes "al_707", which is a prefix of one of the string options but not an actual option. This causes errors and may be related to why I am only getting a single survivor in the Pareto set, but I am not sure. The bug is also present in the mixed-variable GA sample code from the documentation (provided below):

from pymoo.core.problem import ElementwiseProblem
from pymoo.core.variable import Real, Integer, Choice, Binary
from pymoo.core.mixed import MixedVariableGA
from pymoo.visualization.scatter import Scatter
from pymoo.algorithms.moo.nsga2 import RankAndCrowdingSurvival
from pymoo.optimize import minimize
from pymoo.core.population import Population

import pandas as pd

class MultiObjectiveMixedVariableProblem(ElementwiseProblem):

    def __init__(self, **kwargs):
        vars = {
            "b": Binary(),
            "x": Choice(options=["nothing", "multiply"]),
            "y": Integer(bounds=(-2, 2)),
            "z": Real(bounds=(-5, 5)),
        }
        super().__init__(vars=vars, n_obj=2, n_ieq_constr=0, **kwargs)

    def _evaluate(self, X, out, *args, **kwargs):
        b, x, z, y = X["b"], X["x"], X["z"], X["y"]

        f1 = z ** 2 + y ** 2
        f2 = (z+2) ** 2 + (y-1) ** 2

        if b:
            f2 = 100 * f2

        print(f"\n{x}")

        if x == "multiply":
            f2 = 10 * f2
            print("Multiplying f2!!!")

        out["F"] = [f1, f2]

problem = MultiObjectiveMixedVariableProblem()

algorithm = MixedVariableGA(pop_size=20, survival=RankAndCrowdingSurvival())

res = minimize(problem,
               algorithm,
               ('n_gen', 50),
               seed=1,
               save_history=True,
               verbose=True)

plot = Scatter()
plot.add(problem.pareto_front(), plot_type="line", color="black", alpha=0.7)
plot.add(res.F, facecolor="none", edgecolor="red")
plot.show()
corsair20141 commented 1 year ago

Output showing "Choice":

multipl

multipl

nothing

multipl

multipl

nothing

...

Notice that "multipl" is not "multiply", and therefore the logic inside the example problem does not work:

if x == "multiply":
    f2 = 10 * f2
    print("Multiplying f2!!!")

There is a bug with the "Choice" variable type.

jacktang commented 1 year ago

Hello @corsair20141, here is the result of your demo code:

(attached screenshot: scatter plot of res.F against the problem's Pareto front)

And the stdout:

multiply
Multiplying f2!!!

multiply
Multiplying f2!!!

multiply
Multiplying f2!!!

multiply
Multiplying f2!!!

multiply
Multiplying f2!!!

multiply
Multiplying f2!!!

nothing

multiply
Multiplying f2!!!

multiply
...
nothing

multipl
    50 |     1000 |  1.8830697321 |  4.9642532090

And pymoo version:

import pymoo
print(pymoo.__version__)

> 0.6.0.1
corsair20141 commented 1 year ago

That is not what happens on my machine. I am also on pymoo version 0.6.0.1.

I get the truncated "multipl", and therefore it does NOT trigger the conditional multiplication logic inside the function definition:

...
nothing

nothing

multipl

multipl

multipl

multipl

nothing

multipl

nothing

multipl
    50 |     1000 |  1.8830697321 |  4.9642532090
print(pymoo.__version__)

0.6.0.1
corsair20141 commented 1 year ago

It appears to truncate the choices to the length of the first choice string. Output if the choices are ["noth","multiply"]:

noth

noth

mult

noth

mult

mult

noth

mult
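A plausible explanation for this truncation pattern (an assumption on my part, not confirmed anywhere in this thread) is NumPy's fixed-width string dtype: if the sampled values pass through an array whose dtype is inferred from the first option, longer strings are silently cut to that width.

```python
import numpy as np

# An array built from 4-character strings gets dtype '<U4'
# (fixed-width, 4 characters).
a = np.array(["noth", "noth", "noth"])

# Assigning a longer string silently truncates it to the dtype width.
a[1] = "multiply"
print(a)  # ['noth' 'mult' 'noth']
```

This would reproduce exactly the "noth"/"mult" output above, and "al_7075" cut to "al_707" if a 6-character neighbor set the width.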
blankjul commented 1 year ago

Can you quickly check whether it works with the newest version of the main branch? What numpy version are you using? Does updating it resolve the issue?

corsair20141 commented 11 months ago

The latest version of pymoo (0.6.1) fixed the issue, thank you.