facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

MetaModel Deprecation Warning: Conversion of an array with ndim > 0 to a scalar #1612

Closed BrianSchiller closed 1 week ago

BrianSchiller commented 3 months ago

Steps to reproduce

  1. Create an optimizer that uses a MetaModel, such as MetaModelOnePlusOne, and give it a budget of ~50
  2. Run the optimizer with a configuration that calls the function _learn_on_kbest
  3. _learn_on_kbest then throws DeprecationWarnings

Observed Results

Expected Results

Relevant Code

nevergrad\optimization\metamodel.py:83:

minimum = optimizer.minimize(
    lambda x: float(model.predict(polynomial_features.fit_transform(x[None, :])))
).value

(Here only a single element is returned, and therefore this call should work fine)
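For reference, a minimal numpy-only sketch of the single-element case (the array here is a stand-in for what `model.predict` returns on one sample), showing how the deprecated implicit conversion can be avoided altogether:

```python
import numpy as np

pred = np.array([0.8339156])  # stand-in for model.predict(...) on a single sample

# float(pred) on a size-1 array with ndim > 0 is deprecated since NumPy 1.25;
# extracting the element explicitly never warns:
value = pred.item()          # Python float
value_alt = float(pred[0])   # indexing first is also fine
```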

nevergrad\optimization\metamodel.py:91:

if float(model.predict(polynomial_features.fit_transform(minimum[None, :]))) > y[0]:

(Here multiple values are returned)

The returned array can look like this:

[0.8339156  0.00786092 0.85692649 0.8821352  0.9505408  0.9708708
 0.11402067 0.98607812 1.00224294 0.98692334]
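A quick numpy-only sketch of the multi-element case (the array stands in for the `model.predict` output above): unlike the size-1 case, a multi-element array cannot be converted to a scalar at all, so the comparison at metamodel.py:91 needs a single value first; which reduction is correct depends on the intended semantics:

```python
import numpy as np

multi = np.array([0.8339156, 0.00786092, 0.85692649])  # multi-element prediction

# float(multi) raises TypeError for arrays with more than one element:
try:
    float(multi)
    converted = True
except TypeError:
    converted = False

# Pick a single value explicitly before comparing,
# e.g. the smallest prediction (assumption, not nevergrad's actual fix):
best = multi.min()
```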

teytaud commented 1 week ago

Hello. Sorry for the delay. This part of the code has moved a lot in the meantime, as the MetaModel part has been improved significantly. On the current version (github/main) I do not get any deprecation warning. If you share the exact code with which you get this, I will double-check.

teytaud commented 1 week ago

As the problem seemingly no longer exists, I am closing the issue. If you still hit a problem with the current github/main, please reopen and post the exact code, and I'll investigate.