facebookresearch / hydra

Hydra is a framework for elegantly configuring complex applications
https://hydra.cc
MIT License

[Feature Request] Allow complete group override while Inheriting other Configs #2993

Open qsh-zh opened 3 days ago

qsh-zh commented 3 days ago

🚀 Feature Request

When running multiple experiments with hierarchical configurations, there's a need to completely override specific config groups (like optimizer settings) while preserving other non-group configurations. Currently, Hydra merges all configurations, which can lead to unwanted parameter inheritance.
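
For context, Hydra composes configs via OmegaConf's recursive merge, and a merge can add or update keys but never delete them, which is why stale keys survive. Here is a minimal standalone sketch of that merge behavior (plain OmegaConf; merge_demo.py is a hypothetical file name, not part of the repro below):

merge_demo.py
---
from omegaconf import OmegaConf

# optim block contributed by the first experiment (adam plus tuning keys)
first = OmegaConf.create({"optim": {"lr": 0.002, "eps": 1e-4, "extra_param": "x"}})

# optim block contributed by the second experiment (sgd-only keys)
second = OmegaConf.create({"optim": {"lr": 0.05, "momentum": 0.9}})

# the merge updates lr but keeps eps and extra_param: keys are never removed
print(OmegaConf.to_yaml(OmegaConf.merge(first, second)))
# optim:
#   lr: 0.05
#   eps: 0.0001
#   extra_param: x
#   momentum: 0.9

---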

Here is a concrete example:

demo/train.py
---
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(version_base=None, config_path="conf", config_name="config")
def main(cfg: DictConfig) -> None:
    print("\n=== Configuration Details ===")
    print("\nFull config:")
    print(OmegaConf.to_yaml(cfg))

if __name__ == "__main__":
    main()

---
demo/conf/config.yaml
---
# @package _global_

defaults:
  - _self_
  - optim: null
  - experiment: null

---
demo/conf/optim/adam.yaml
---
lr: 0.001
betas: [0.9, 0.999]
eps: 1e-8
weight_decay: 0.0

---
demo/conf/optim/sgd.yaml
---
lr: 0.01
momentum: 0.9

---
demo/conf/experiment/first.yaml
---
# @package _global_

defaults:
  - override /optim: adam

optim:
  lr: 0.002  # override default lr
  weight_decay: 0.01
  eps: 1e-4
  extra_param: "this_should_be_deleted_in_second_experiment"

other_important_param: "this_should_be_not_deleted_in_second_experiment"

---
demo/conf/experiment/second.yaml
---
# @package _global_

defaults:
  - /experiment/first
  - override /optim: sgd

optim:
  lr: 0.05  # Only keep SGD-specific params
  momentum: 0.9

---

If I run the first experiment:

# python train.py experiment=first
optim:
  lr: 0.002
  betas:
  - 0.9
  - 0.999
  eps: 0.0001
  weight_decay: 0.01
  extra_param: this_should_be_deleted_in_second_experiment
other_important_param: this_should_be_not_deleted_in_second_experiment

If I run the second experiment:
# python train.py experiment=second
optim:
  lr: 0.05
  momentum: 0.9
  weight_decay: 0.01
  eps: 0.0001
  extra_param: this_should_be_deleted_in_second_experiment
other_important_param: this_should_be_not_deleted_in_second_experiment

What I expect

  1. In the second experiment, other_important_param should be kept.
  2. The optim group inherited from the first experiment should be completely replaced, i.e. the keys extra_param, weight_decay, and eps should be removed:
# python train.py experiment=second
optim:
  lr: 0.05
  momentum: 0.9
other_important_param: this_should_be_not_deleted_in_second_experiment
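
Until a replace-instead-of-merge semantic exists, one workaround (a sketch; conf/optim/first_adam.yaml is a hypothetical file I am introducing here) is to move the first experiment's tuned values out of the inline optim block and into their own optim group option. A group override replaces the group's selection outright, while inline keys are merged on top, so the second experiment's "override /optim: sgd" then swaps the whole group:

demo/conf/optim/first_adam.yaml
---
# adam defaults plus the first experiment's tuning, packaged as a group option
lr: 0.002
betas: [0.9, 0.999]
eps: 1e-4
weight_decay: 0.01
extra_param: "this_should_be_deleted_in_second_experiment"

---
demo/conf/experiment/first.yaml (rewritten)
---
# @package _global_

defaults:
  - override /optim: first_adam

other_important_param: "this_should_be_not_deleted_in_second_experiment"

---

With this layout, python train.py experiment=second selects sgd.yaml for the optim group and merges only second.yaml's inline keys on top, producing the expected output above. As a stopgap, individual keys can also be dropped at the command line with Hydra's ~ delete-override syntax (e.g. python train.py experiment=second ~optim.extra_param), though that does not scale to many keys.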

Motivation

See the example above.



qsh-zh commented 1 day ago

Related discussion: https://github.com/facebookresearch/hydra/issues/2956#issuecomment-2495085448

@jesszzzz @omry @Jasha10