CMA-ES / pycma

Python implementation of CMA-ES

1-D optimization problem #86

Closed fazaghifari closed 2 years ago

fazaghifari commented 5 years ago

Hi!

I tried a one-dimensional optimization of a Kriging hyperparameter using your code, and this message appeared:

ValueError: optimization in 1-D is not supported (code was never tested)

Does this mean that CMA-ES cannot be applied to one-dimensional problems, or is something wrong with my input?

Thanks.

nikohansen commented 5 years ago

This implementation of CMA-ES cannot be directly applied to a 1-D problem. (Remark that covariances do not exist in 1-D, hence the notion of a covariance matrix is moot here. In dimension 1, CMA-ES is just a simple evolution strategy with step-size adaptation.)

A simple and quick workaround is to define a 2-D objective function wrapper and only use the first variable in the function. Setting the option CMA_on=0 also makes sense in this case, in particular for longer runs. It's somewhat suboptimal in terms of quick convergence, but it should work reasonably well and be robust.

import cma

def f(x):  # a 1-D function
    return x**2

x0 = 1
es = cma.CMAEvolutionStrategy(2 * [x0], 1, {'CMA_on': 0})
es.optimize(lambda x: f(x[0]))  # on-the-fly 2-D -> 1-D wrapper
es.logger.plot(xsemilog=True)
Output (click to expand)

```
covariance matrix adaptation turned off
(3_w,6)-aCMA-ES (mu_w=2.0,w_1=63%) in dimension 2 (seed=1065710, Fri Jun 12 20:40:00 2020)
Iterat #Fevals   function value        axis ratio  sigma    min&max std  t[m:s]
     1      6 3.308809299588352e-02    1.0e+00    9.97e-01  1e+00  1e+00 0:00.0
     2     12 3.230650188704424e-04    1.0e+00    8.45e-01  8e-01  8e-01 0:00.0
     3     18 1.464418694469716e-02    1.0e+00    9.35e-01  9e-01  9e-01 0:00.0
   100    600 4.263981258485893e-11    1.0e+00    1.52e-05  2e-05  2e-05 0:00.1
   125    750 1.664438548029092e-13    1.0e+00    1.59e-06  2e-06  2e-06 0:00.1
```

[screenshot: logger plot of the run]
ZhangAllen98 commented 1 year ago

For a 1-D problem where f(x) has some constraints, I defined the problem with cfun = cma.ConstrainedFitnessAL(f, f_constraint, find_feasible_first=True). After calling es.optimize with a lambda wrapper and covariance matrix adaptation turned off, I used res_x = cfun.find_feasible(es) to get a feasible result, but I get an error like ValueError: operands could not be broadcast together with shapes (3,) (6,). What should I do to eliminate this error? In my case, it seems that using es.result.xfavorite does not always satisfy the constraints completely.

nikohansen commented 1 year ago

To understand why this error is happening, it would be really useful to see the calling code, in particular the line where it breaks, and the full error message.

Otherwise, this seems like expected behavior: find_feasible returns a feasible solution, but there is no guarantee that xfavorite is feasible.
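
For reference, here is a minimal sketch of how ConstrainedFitnessAL can be combined with the 2-D wrapper from above; the objective f1 and constraint g1 are made-up placeholders, not the code from the report above:

```python
import cma

def f1(x):           # hypothetical 1-D objective
    return (x - 2) ** 2

def g1(x):           # hypothetical 1-D constraint, values <= 0 mean feasible
    return [1 - x]   # i.e. feasible for x >= 1

# 2-D wrappers so that pycma accepts the problem; only the first coordinate is used
cfun = cma.ConstrainedFitnessAL(lambda x: f1(x[0]), lambda x: g1(x[0]))
es = cma.CMAEvolutionStrategy(2 * [0.5], 0.5, {'CMA_on': 0})
es.optimize(cfun, callback=cfun.update)  # cfun.update adapts the Lagrange multipliers
x = cfun.find_feasible(es)               # a feasible solution in the 2-D wrapper space
print(x[0])                              # only the first coordinate is meaningful
```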

Pehnny commented 1 year ago

I have a question regarding 1-D problems with CMA-ES. I'm currently working on my Master's thesis at uni, and I've been asked to try to optimize a problem with CMA-ES. The problem looks like this:

def f(x, y, z) -> float :
    """Do stuff"""
    return something

def constraint(x, y) -> float :
    return 1 - x - y

With x and y inside [0, 1], so the problem is 2-D. However, I'm asked to try setting y = 0. Of course, this raised the error telling me that 1-D is not supported. In order to trick CMA-ES, I ask it for both x and y and then set every y value to 0. I was wondering whether this is OK or not, because the trick you came up with doesn't use y but leaves its value unchanged. To make things clearer, the code is:

import pickle
from cma import CMAEvolutionStrategy as CMAES

# first run: create the solver and sample an initial population
solver: CMAES = CMAES(parameters["initial"], parameters["sigma"], parameters["options"])
population: list = solver.ask(parameters["population"])
for individual in population:
    individual[-1] = 0.  # force y (the last coordinate) to 0
"""Creates folders for parallel fitness evaluation and stuff"""

# later runs: reload the pickled solver and update it with the evaluated population
"""Loads previous population and fitness"""
with open(solver_file, "rb") as file:
    solver: CMAES = pickle.load(file)
solver.tell(previous_population, fitness)
if not solver.stop():
    population = solver.ask(parameters["population"])
    for individual in population:
        individual[-1] = 0.  # force y to 0 again
    """Save history and stuff"""

With parameters looking like this:

{
    "generation" : 50,
    "population" : 50,
    "sigma" : 0.05,
    "initial" : [
        0.275, 0.00
    ],
    "options" : {
        "bounds" : [
            [0.05, null],
            [0.50, null]
        ]
    }
}
nikohansen commented 1 year ago

Sorry, I don't understand this code. In particular, I don't see to which value previous_population is set, which may be crucial for the behavior.

Generally, just ignoring the parameter value within the fitness function is a possible approach here which should usually work reasonably well. Manipulating solutions (by setting some coordinate) is generally not recommended as it will often have unexpected and undesirable side effects.
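
A sketch of what ignoring the coordinate inside the fitness function could look like with the posted bounds, sigma, and initial point; the objective f is a made-up stand-in for the actual simulation, not the thesis code:

```python
import cma

def f(x, y):  # made-up stand-in for the real objective f(x, y, z)
    return (x - 0.3) ** 2 + y ** 2

opts = {'CMA_on': 0,  # as suggested above for effectively 1-D problems
        'bounds': [[0.05, None], [0.50, None]]}  # only the first coordinate is bounded
es = cma.CMAEvolutionStrategy([0.275, 0.0], 0.05, opts)
# y is fixed to 0 inside the wrapper; the sampled second coordinate is simply ignored
es.optimize(lambda v: f(v[0], 0.0))
print(es.result.xbest[0])  # only the first coordinate carries information
```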

Pehnny commented 1 year ago

> Sorry, I don't understand this code. In particular, I don't see to which value previous_population is set, which may be crucial for the behavior.

To clarify, every element of population is saved in a different folder where its fitness is evaluated. Then previous_population gets the values back to update CMA-ES. However, before sending the values out, I change the last element of each one to 0. So when previous_population comes back, the last value of each individual is not the same as the one CMA-ES generated.

Anyway, if you think this is a bad manipulation, I'll update the code to set the last element to 0 inside the objective function instead.