Epistimio / hydra_orion_sweeper

Hydra Sweeper Plugin for Orion

Hierarchical structures not supported #3

Closed jerpint closed 2 years ago

jerpint commented 2 years ago

Is there a way to enable support for hierarchical structures in the plugin?

The plugin only handles variables that are top-level in the config and breaks as soon as they are nested hierarchically. For example:

config.yaml:

defaults:
  - override hydra/sweeper: orion

hydra:
  sweeper:
    orion:
      name: 'experiment'
      version: '1'

    algorithm:
      type: random
      config:
        seed: 1

    worker:
      n_workers: -1
      max_broken: 3
      max_trials: 100

    storage:
      type: pickledb
      host: 'database.pkl'

    # default parametrization of the search space
    parametrization:
      optimizer:
        name: "choices(['Adam', 'SGD'])"
        lr: "uniform(0, 1)"
      dropout: "uniform(0, 1)"
      batch_size: "uniform(4, 16, discrete=True)"

optimizer:
  lr: 0.01
  name: 'Adam'
dropout: 0.6
batch_size: 8

# if true, simulate a failure by raising an exception
error: false
return_type: float

my_app.py:

import logging

import hydra
from omegaconf import DictConfig

log = logging.getLogger(__name__)

@hydra.main(config_path=".", config_name="config", version_base="1.1")
def dummy_training(cfg: DictConfig) -> float:
    """A dummy function to minimize
    Minimum is 0.0 at:
    lr = 0.12, dropout=0.33, opt=Adam, batch_size=4
    """
    do = cfg.dropout
    bs = cfg.batch_size
    lr = cfg.optimizer.lr
    opt = cfg.optimizer.name
    out = float(
        abs(do - 0.33) + int(opt != "Adam") + abs(lr - 0.12) + abs(bs - 4)
    )
    log.info(
        f"dummy_training(dropout={do:.3f}, lr={lr:.3f}, opt={opt}, batch_size={bs}) = {out:.3f}",
    )
    if cfg.error:
        raise RuntimeError("cfg.error is True")

    if cfg.return_type == "float":
        return out

    if cfg.return_type == "dict":
        return dict(name="objective", type="objective", value=out)

    if cfg.return_type == "list":
        return [dict(name="objective", type="objective", value=out)]

    if cfg.return_type == "none":
        return None

if __name__ == "__main__":
    dummy_training()

OUTPUT:

[2022-07-23 11:14:11,353][HYDRA] Orion Optimizer {'type': 'random', 'config': {'seed': 1}}
[2022-07-23 11:14:11,353][HYDRA] with parametrization {'batch_size': 'uniform(4, 16, discrete=True)', 'dropout': 'uniform(0, 1)'}
no viable alternative at input '{'name''
See https://hydra.cc/docs/next/advanced/override_grammar/basic for details

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
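Note how the printed parametrization only retains the top-level keys (`batch_size`, `dropout`) and drops the nested `optimizer` group entirely. Since Hydra's override grammar already accepts dotted paths (e.g. `optimizer.lr=0.01`), one possible direction (a sketch only, not the plugin's actual implementation; `flatten` is a hypothetical helper) is to flatten the nested parametrization into dot-delimited keys before building the search space, so that sampled values can be passed back to Hydra as standard dotted overrides:

```python
def flatten(d, prefix=""):
    """Recursively flatten a nested dict into {'a.b.c': value} form."""
    out = {}
    for key, value in d.items():
        dotted = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, prefix=f"{dotted}."))
        else:
            out[dotted] = value
    return out

parametrization = {
    "optimizer": {
        "name": "choices(['Adam', 'SGD'])",
        "lr": "uniform(0, 1)",
    },
    "dropout": "uniform(0, 1)",
    "batch_size": "uniform(4, 16, discrete=True)",
}

space = flatten(parametrization)
# space now maps dotted keys to priors:
# {'optimizer.name': ..., 'optimizer.lr': ..., 'dropout': ..., 'batch_size': ...}
```

Each flattened key is then a valid Hydra override path, so `optimizer.lr=0.5 dropout=0.3` would be applied to the nested config without any special casing.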
bouthilx commented 2 years ago

Hi @jerpint ! @Delaunay is working on a fix in PR #4.