facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

Issue with constrained optimization #1558

Closed samer-noureddine closed 9 months ago

samer-noureddine commented 9 months ago

Steps to reproduce

  1. Initialize a random binary matrix
  2. Set up a simple loss function to minimize, e.g. the sum of all entries in the matrix
  3. Update the entries of the binary matrix (entries constrained to {0,1}) to minimize the loss function

Observed Results

The optimizer got stuck.

Expected Results

The recommendation.value should be a binary matrix of all zeros.

Relevant Code

import numpy as np
import nevergrad as ng

# initialize random binary matrix
binary_matrix = np.random.randint(0, 2, (3, 3), dtype='int32')

# set up loss function
def allsum(binary_matrix):
    return np.sum(binary_matrix)

# minimize loss subject to constraints
optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
optimizer.parametrization.register_cheap_constraint(lambda x: ((x==0) | (x==1)).all())
recommendation = optimizer.minimize(allsum, verbosity=2)
print(recommendation.value)

teytaud commented 9 months ago

Solved by the example in https://github.com/facebookresearch/nevergrad/blob/newsmooth/examples/advbinmatrix.py, if you agree with this?

(in your code, parametrization=2 means that you are optimizing 2 continuous variables, which is not the case here)

teytaud commented 9 months ago

Closed.