Closed. mmiakashs closed this issue 3 years ago.
You can set the value of a hyperparam to a list for any model config parameter to sweep over those values.
For example:

    hyperparam(
        "model_config.visual_bert.num_labels", [2, 4], save_dir_key=lambda val: f"num_labels{val}"
    ),
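For intuition, list-valued hyperparams are expanded into a cross-product grid, so experiment-level and model-level parameters can be mixed freely in one sweep. Here is a minimal, self-contained sketch of that expansion (not MMF's actual sweep implementation; the keys and values are illustrative):

```python
from itertools import product

# Each entry maps a dotted config key (experiment- or model-level)
# to the list of values to sweep over.
sweep = {
    "optimizer.params.lr": [1e-4, 5e-5],
    "training.batch_size": [128, 256],
    "model_config.visual_bert.num_labels": [2, 4],
}

# Expand into the full cross-product grid: one dict of overrides per run.
keys = list(sweep)
runs = [dict(zip(keys, values)) for values in product(*sweep.values())]

print(len(runs))  # 2 * 2 * 2 = 8 runs
```

Each resulting dict would correspond to one job launched on the cluster, with its save directory named from the `save_dir_key` callbacks.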
Does that answer your question?
Thanks, @vedanuj, for the explanation. I have a very novice-level question: in the hyperparameter sweep, why do we need to add a model_config prefix in front of the model config hyperparameters, whereas we do not need to include an exp_config prefix in front of the exp_config parameters (in the above cases, batch_size and lr)? Is this the default behavior?
You should check this note about the MMF configuration system to understand this in detail.
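As a rough illustration of why the prefix differs (this is a simplified, illustrative structure, not MMF's exact schema): model-specific options are nested under model_config.&lt;model&gt;, while training options such as batch_size sit under top-level sections of the merged config, so a sweep key must spell out the full dotted path to the leaf it overrides:

```python
# Illustrative, simplified view of an MMF-style nested config
# (keys are examples, not the exact MMF schema).
config = {
    "training": {"batch_size": 128},
    "optimizer": {"params": {"lr": 5e-5}},
    "model_config": {"visual_bert": {"num_labels": 2}},
}

def set_by_dotted_key(cfg, dotted_key, value):
    """Resolve a dotted override like 'model_config.visual_bert.num_labels'."""
    *parents, leaf = dotted_key.split(".")
    node = cfg
    for key in parents:
        node = node[key]
    node[leaf] = value

set_by_dotted_key(config, "model_config.visual_bert.num_labels", 4)
print(config["model_config"]["visual_bert"]["num_labels"])  # 4
```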
❓ Questions and Help
I am considering running hyperparameter sweeps on SLURM. I was following this tutorial: https://mmf.sh/docs/tutorials/slurm There, the script sweeps experiment-related hyperparameters (batch size / learning rate). However, is it possible to sweep both the experiment-related hyperparameters (listed in exp_config.yaml) and the model-related hyperparameters (listed in model_config.yaml) using the same script?
For example:
Or could you please share any resource for sweeping both experiment-related and model-related parameters on SLURM? Thanks