themagpipers opened 9 months ago
Actually, it's not only SSANSGA2 but any surrogate model; I face exactly the same problem with GPSAF-NSGA-II, for instance. I've added some debug lines in the pysamoo and pymoo source code. As I posted above, the evaluations in the first generation are fine, which indicates that, unlike several other people who faced a similar error message, I have defined my number of objectives and constraints correctly (the fact that my code runs fine with plain pymoo also points that way).
I should also mention that, when moving from pymoo to pysamoo, I had to change some lines in my custom Problem class:
class MyProblem(ElementwiseProblem):
    def __init__(self, **kwargs):
        in_vars = {
            "cl_h": Real(bounds=(6.0, 16.0)),
            "cr_h": Real(bounds=(0.57, 0.85)),
            "g_n_div_by_2": Integer(bounds=(1, 5)),
            "cl_n": Integer(bounds=(1, 11)),
            "g_l": Real(bounds=(0.1, 0.3)),
        }
        super().__init__(vars=in_vars, n_obj=2, n_ieq_constr=2, **kwargs)
to
class MyProblem(ElementwiseProblem):
    def __init__(self, **kwargs):
        in_vars = {
            "cl_h": Real(),
            "cr_h": Real(),
            "g_n_div_by_2": Integer(),
            "cl_n": Integer(),
            "g_l": Real(),
        }
        super().__init__(vars=in_vars, n_obj=2, n_ieq_constr=2,
                         xl=np.array([0.06, 0.0057, 1, 10, 0.01]),
                         xu=np.array([0.16, 0.0085, 5, 110, 0.03]), **kwargs)
otherwise I would get an error, which is not surprising because pysamoo explicitly checks for xl and xu (apparently pymoo doesn't). I will soon face another problem (if I can get pysamoo to work at all), because I plan to add a variable based on the "Choice" type, which, unlike ints and floats, has no bounds, so I don't know how to express it through xl and xu.
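(For reference, this is how such a variable would be declared on the pymoo side; a minimal sketch, assuming a hypothetical categorical variable named "foil_shape" with made-up options. It does not answer how pysamoo's xl/xu check would handle it.)
from pymoo.core.variable import Real, Integer, Choice
from pymoo.core.problem import ElementwiseProblem

class MyMixedProblem(ElementwiseProblem):
    def __init__(self, **kwargs):
        in_vars = {
            "cl_h": Real(bounds=(6.0, 16.0)),
            "g_n_div_by_2": Integer(bounds=(1, 5)),
            # Choice has no numeric bounds, only a list of options:
            "foil_shape": Choice(options=["naca0012", "naca2412", "flat"]),
        }
        super().__init__(vars=in_vars, n_obj=2, n_ieq_constr=2, **kwargs)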
I am still lost and have made very little progress. I don't know whether my code is faulty or pysamoo has a bug, in which case I'd like to work around it. Thanks for any help.
pysamoo does not support mixed variables (at least it was not directly developed for them).
I am sure it can be customized if the surrogate models are able to handle this as well. However, RBF or Kriging already assume a continuous variable space.
Oh, I see... When you say mixed-variable, do you mean that pysamoo doesn't support a mix of say, floats and ints? Only floats would work?
Yes, exactly. And to my knowledge there is not much research on mixed-variable surrogates either. But I would refer you to a related research field, hyperparameter optimization. Check out Optuna, for example.
Thank you very much @blankjul , I will check Optuna out.
As a side comment, and because I don't think opening a new issue for a mere comment would be appropriate, I wanted to let you know that I got a great speedup by using:
eliminate_duplicates=NoDuplicateElimination()
My code runs maybe 5 times quicker now, since there is no check to see if individuals are unique.
Even though pymoo's tutorial recommends removing duplicates to get a better solution, in practice (in my case at least) there is almost no way two solutions will ever be equal, because the floats are varied between bounds. It is extremely unlikely that two individuals are identical, even for a population size of over a million, so the speedup from using NoDuplicateElimination() is very valuable.
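For anyone else trying this, a minimal sketch of how that option is passed to the algorithm (shown here with MixedVariableGA, matching the setup described in the original post; the pop_size is arbitrary):
from pymoo.core.duplicate import NoDuplicateElimination
from pymoo.core.mixed import MixedVariableGA

# Skip the duplicate check entirely; with real-valued variables two
# identical individuals are extremely unlikely anyway.
algorithm = MixedVariableGA(pop_size=200,
                            eliminate_duplicates=NoDuplicateElimination())
Note that, depending on the pymoo version, the mating operator of MixedVariableGA may carry its own duplicate elimination as well.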
Yes pymoo is designed to be modular and for you to easily change its behavior, e.g. like duplicate elimination. Are you working with a large population size?
Yes, I was working with a large population (it's not very costly to evaluate an individual).
In the meantime, I did notice that for a different problem it was better to use a custom elimination class to remove duplicates. What surprised me is that even with float parameters I would get duplicates in the population, which I did not expect. For that particular problem it was better to follow the tutorial's suggestion and remove duplicates.
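A minimal sketch of such a custom elimination class, following the subclassing pattern from pymoo's customization docs (this assumes array-valued X; the tolerance is arbitrary):
import numpy as np
from pymoo.core.duplicate import ElementwiseDuplicateElimination

class ToleranceDuplicateElimination(ElementwiseDuplicateElimination):
    def is_equal(self, a, b):
        # Treat two individuals as duplicates if all their variables
        # agree within a small tolerance.
        return np.allclose(a.X, b.X, atol=1e-8)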
I also tried Optuna, and it seems to solve my problem the way I wanted, i.e. running a surrogate-assisted multi-objective optimization with different parameter types and several constraints. So thank you very much for the suggestion.
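In case it helps others, a minimal Optuna sketch of that kind of setup: two objectives, mixed parameter types, and inequality constraints reported through constraints_func. The parameter names and the dummy objective/constraint formulas are made up; NSGAIISampler accepts the same constraints_func argument if a non-model-based sampler is preferred.
import optuna

def objective(trial):
    # Mixed parameter types: float, int, categorical.
    cl_h = trial.suggest_float("cl_h", 6.0, 16.0)
    cl_n = trial.suggest_int("cl_n", 1, 11)
    shape = trial.suggest_categorical("shape", ["a", "b", "c"])

    # Dummy objectives just to make the example run.
    f1 = cl_h + cl_n
    f2 = cl_h * (2.0 if shape == "a" else 1.0)

    # Constraint values <= 0 are treated as feasible by the sampler.
    trial.set_user_attr("constraints", (cl_h - 15.0, 5.0 - cl_n))
    return f1, f2

def constraints(trial):
    return trial.user_attrs["constraints"]

study = optuna.create_study(
    directions=["minimize", "minimize"],
    sampler=optuna.samplers.TPESampler(constraints_func=constraints, seed=1),
)
study.optimize(objective, n_trials=100)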
Hi @blankjul, in my case I've created a problem with only binary variables, and I'm encountering the same error as in the first comment of this issue (only the shape differs). I've also converted the problem to use only integer variables, yet I still get the same error. Is it normal to hit this error when using only discrete variables?
The error indicates that the problem is setting the wrong dimension. Make sure the dimensions match the definition (n_obj, n_ieq_constr, ...).
First, thank you for your response. For example, run the following code:
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize
from pysamoo.algorithms.ssansga2 import SSANSGA2

class MyBinaryProblem(ElementwiseProblem):
    def __init__(self):
        super().__init__(n_var=24, n_obj=2, n_ieq_constr=0, n_eq_constr=4, xl=0, xu=1, vtype=bool)

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = [100.0, 200.0]
        out["H"] = [1.0, 0.2, 0.3, 0.1]

problem = MyBinaryProblem()

algorithm = SSANSGA2(n_initial_doe=50,
                     n_infills=10,
                     surr_pop_size=100,
                     surr_n_gen=100)

minimize(problem, algorithm, ("n_gen", 5), seed=1, save_history=False, verbose=True)
I get the following error:
Exception: ('Problem Error: F can not be set, expected shape (100, 2) but provided (100, 24, 2)', ValueError('cannot reshape array of size 4800 into shape (100,2)'))
What am I doing wrong? Should I define the variable "sampling" in the algorithm?
It seems the issue arises because I defined my problem using ElementwiseProblem. When the evaluation function is called (in the first linked code), a new array is created; then, when a set of solutions is evaluated (in the second linked code), the array ends up with shape (100, 24, 2) while the code expects (100, 2).
Can I define a problem using ElementwiseProblem?
Elementwise problems don't work in pysamoo, if I recall my implementation correctly. But you can just write out the problem definition like this:
import numpy as np
from pymoo.core.problem import Problem
from pymoo.optimize import minimize
from pysamoo.algorithms.ssansga2 import SSANSGA2

class MyBinaryProblem(Problem):
    def __init__(self):
        super().__init__(n_var=24, n_obj=2, n_ieq_constr=0, n_eq_constr=4, xl=0, xu=1, vtype=bool)

    def _evaluate(self, X, out, *args, **kwargs):
        # vectorized evaluation: X holds the whole population, shape (n, n_var)
        n, _ = X.shape
        F = np.zeros(shape=(n, self.n_obj))
        H = np.zeros(shape=(n, self.n_eq_constr))
        for k in range(n):
            F[k] = np.random.random(size=2)
            H[k] = np.random.random(size=4)
        out["F"] = F
        out["H"] = H

problem = MyBinaryProblem()

algorithm = SSANSGA2(n_initial_doe=50,
                     n_infills=10,
                     surr_pop_size=100,
                     surr_n_gen=100)

minimize(problem, algorithm, ("n_gen", 5), seed=1, save_history=False, verbose=True)
Ok, I understand. I will try the code you provided, thank you. Regarding the use of ElementwiseProblem, I've been doing some tests and I would like to suggest changing the linked line from
n = len(X)
to
n = 1 if self.elementwise else len(X)
I believe this might fix the problem.
Any update on supporting mixed variables? Related work is being implemented elsewhere, though only partially: https://github.com/huawei-noah/HEBO/tree/master/MCBO and https://github.com/facebook/Ax/issues/2650
So far there are no plans to support mixed variables out of the box.
Hi, I have a multi-objective (2 objectives) optimization problem that works with pymoo: 2 constraints and 5 variables to vary (some floats, some ints), solved with the MixedVariableGA() algorithm, and it works fine. I wanted to try the surrogate approach (pysamoo) to see the difference. My problem should be compatible with SSANSGA2 as far as I can see, but my attempts lead to an error I haven't been able to fix.
Running the code leads to:
Any pointer to what's possibly wrong is welcome.