facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

When using NGOpt, Print the selected Optimizer #1507

Closed: kayuksel closed this issue 1 year ago

kayuksel commented 1 year ago

Hello everyone, is there an easy way to print the name of the optimizer that gets selected for the problem when using NGOpt or one of its variants?

nhansendev commented 1 year ago

Try this:

import nevergrad as ng

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
print(optimizer.optim)

Result:

Instance of MetaModel(parametrization=Array{(2,)}, budget=100, num_workers=1)
nhansendev commented 1 year ago

Note that some of the multi-choice optimizers (e.g. CMandAS3) have multiple optimizers, which are accessed via "optims" (plural) instead.
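
For illustration, here is a minimal sketch of inspecting those sub-optimizers; it assumes, per the note above, that CMandAS3 exposes the underlying optimizer instances through an "optims" attribute:

import nevergrad as ng

# CMandAS3 combines several optimizers; per the note above they are
# exposed via "optims" (plural) rather than "optim".
optimizer = ng.optimizers.CMandAS3(parametrization=2, budget=100)
for sub_optimizer in optimizer.optims:
    print(sub_optimizer)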

kayuksel commented 1 year ago

Thank you very much. I am currently travelling and will try those when I get back to the office.

kayuksel commented 1 year ago

Hello again,

It gives me the following error: '_DE' object has no attribute 'optim'

My code is as follows. I am optimizing using every optimizer available.

I guess that optim only exists for combined optimizers like NGOpt?

# train, test, and assets are defined earlier in my script.
for optimizer_name in [x for x in dir(ng.optimization.optimizerlib) if x[0].isupper() and not x.startswith("_")]:
    try:
        # Skip a few optimizers that I exclude from the comparison.
        if optimizer_name in ['BO', 'BOSplit', 'EDA', 'MEDA', 'HyperOpt', 'MPCEDA', 'PCEDA']:
            continue
        optimizer_class = getattr(ng.optimization.optimizerlib, optimizer_name)
        optimizer = optimizer_class(parametrization=ng.p.Array(shape=(len(assets),)), budget=60)
        x = optimizer.minimize(train).value
        print('%s %f' % (optimizer_name, test(x)))
    except Exception as e:
        print(e)
        continue
nhansendev commented 1 year ago

You are correct. "optim" and "optims" only work for combined optimizers. For all optimizers you can use optimizer.name:

import nevergrad as ng

optimizer = ng.optimizers.Cobyla(parametrization=2, budget=100)
print(optimizer.name)

Result:

Cobyla

Note that this will return "NGOpt" for NGOpt instead of the optimizer that it selects.
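
For illustration, a small helper sketch that combines the two attributes discussed in this thread; the name describe_optimizer is hypothetical, and it assumes, per the comments above, that combined optimizers expose "optim" while every optimizer exposes "name":

import nevergrad as ng

def describe_optimizer(optimizer):
    # "optim" only exists on combined optimizers such as NGOpt;
    # fall back to the optimizer's own name otherwise.
    inner = getattr(optimizer, 'optim', None)
    if inner is not None:
        return '%s -> %s' % (optimizer.name, inner.name)
    return optimizer.name

print(describe_optimizer(ng.optimizers.NGOpt(parametrization=2, budget=100)))
print(describe_optimizer(ng.optimizers.Cobyla(parametrization=2, budget=100)))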

kayuksel commented 1 year ago

Thank you.