facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

Defining constraint vs. incorporating this info in the objective function #1580

Open · ogencoglu opened this issue 7 months ago

ogencoglu commented 7 months ago

Following your constrained optimization example in the docs:

import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
# define a constraint on first variable of x:
optimizer.parametrization.register_cheap_constraint(lambda x: x[0] >= 1)

recommendation = optimizer.minimize(square, verbosity=2)
print(recommendation.value)
# >>> [1.00037625, 0.50683314]

Is this the most efficient way to perform the optimization, or would it be better to incorporate the same constraint into the objective function itself (e.g. with an if clause that returns inf or a very large value for infeasible points)? What are the fundamental differences between these two approaches, both in computational cost and in how the optimization proceeds?
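For reference, a minimal sketch of the penalty-based alternative I have in mind (the function name `square_with_penalty` and the penalty weight `1e6` are my own choices, not from the docs). Instead of returning inf for infeasible points, it adds a large penalty proportional to the constraint violation, which at least tells the optimizer *how* infeasible a point is:

```python
import numpy as np

def square_with_penalty(x):
    # Same objective as before, plus a penalty for violating x[0] >= 1.
    # The weight 1e6 is an arbitrary large constant (an assumption);
    # a finite penalty proportional to the violation is usually gentler
    # on the optimizer than returning float("inf").
    violation = max(0.0, 1.0 - x[0])
    return float(np.sum((x - 0.5) ** 2)) + 1e6 * violation

# Feasible point: no penalty, just the squared distance to 0.5.
print(square_with_penalty(np.array([1.0, 0.5])))  # 0.25
# Infeasible point: penalty term dominates.
print(square_with_penalty(np.array([0.0, 0.5])))
```

One would then pass `square_with_penalty` to `optimizer.minimize(...)` without registering any constraint, and keep everything else unchanged.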