facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

Unable to pickle FCMAES optimizer #1582

Open · nhansendev opened this issue 6 months ago

nhansendev commented 6 months ago

Steps to reproduce

(see code example below)

  1. Set up ParametrizedCMA as the optimizer with fcmaes=True
  2. Run the optimization (fully or partially)
  3. Call optimizer.dump, which raises a pickling error

Observed Results

The dump call fails with an error raised from deep inside multiprocessing:

optimizer.dump("test.pkl")
  File "D:\Atom\VENVs\torch_env\Lib\site-packages\nevergrad\optimization\base.py", line 242, in dump
    pickle.dump(self, f)
  File "C:\Users\nate\AppData\Local\Programs\Python\Python311\Lib\multiprocessing\heap.py", line 54, in __getstate__
    assert_spawning(self)
  File "C:\Users\nate\AppData\Local\Programs\Python\Python311\Lib\multiprocessing\context.py", line 373, in assert_spawning
    raise RuntimeError(
RuntimeError: Arena objects should only be shared between processes through inheritance
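
The RuntimeError is raised from multiprocessing's shared-memory heap, which suggests the fcmaes backend leaves a handle to a shared-memory Arena inside the optimizer object; such handles cannot be pickled outside of process spawning. A minimal check of this reading, assuming the problem is specific to the fcmaes backend, is to run the same setup with fcmaes=False and confirm that dump then succeeds:

import nevergrad as ng
import numpy as np

def simple_func(params):
    return sum(params)

# Same setup as the script below, but with the default pycma backend (fcmaes=False)
opt = ng.families.ParametrizedCMA(
    elitist=True, popsize=100, diagonal=False, fcmaes=False, random_init=True
)
optimizer = opt(
    parametrization=ng.p.Array(init=np.random.rand(10) - 0.5),
    budget=100,
    num_workers=1,
)
optimizer.enable_pickling()  # needed before pickling CMA-based optimizers
optimizer.minimize(simple_func)
optimizer.dump("test_pycma.pkl")  # expected to succeed, isolating fcmaes as the culprit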

Expected Results

The optimizer is saved (pickled) without error.

Relevant Code

import nevergrad as ng
import numpy as np

def simple_func(params):
    return sum(params)

ng_params = ng.p.Array(init=np.random.rand(10) - 0.5)

opt = ng.families.ParametrizedCMA(
    elitist=True, popsize=100, diagonal=False, fcmaes=True, random_init=True
)

optimizer = opt(
    parametrization=ng_params,
    budget=100,
    num_workers=1,
)
optimizer.enable_pickling()  # makes no difference: the dump below still fails

rec = optimizer.minimize(simple_func).value
print(rec)

optimizer.dump("test.pkl")
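
As a stopgap until the fcmaes-backed optimizer itself can be pickled, a possible workaround (continuing from the script above; the file name recommendation.pkl is arbitrary) is to persist only the recommendation rather than the whole optimizer object:

import pickle

# Workaround sketch: save only the recommended parameter value, which is a
# plain numpy array and pickles without touching the fcmaes internals.
recommendation = optimizer.recommend()
with open("recommendation.pkl", "wb") as f:
    pickle.dump(recommendation.value, f)

# Reloading later:
with open("recommendation.pkl", "rb") as f:
    best_params = pickle.load(f)
print(best_params)

This loses the optimizer's internal state (so the run cannot be resumed), but it is enough when only the best point found so far is needed.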