mle-infrastructure / mle-toolbox

Lightweight Tool to Manage Distributed ML Experiments 🛠
https://mle-infrastructure.github.io/mle_toolbox/toolbox/
MIT License

`net_config` integration in hypersearch #21

Closed. RobertTLange closed this issue 3 years ago.

RobertTLange commented 3 years ago

Currently we can only search over variables defined in `train_config`. If you want to search over different architectures, you have to do so 'implicitly', i.e. by writing your own hacky mapping along the lines of:

```python
net_config.layer1.hidden_size = train_config.hidden_size
```
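
Spelled out, such a workaround might look like the following (a minimal sketch; the function name and config keys are hypothetical, and both configs are assumed to behave like nested dicts):

```python
# Hypothetical manual mapping: copy architecture-related values that were
# sampled into train_config over to the corresponding net_config fields.
def apply_arch_overrides(train_config: dict, net_config: dict) -> dict:
    """Overwrite net_config entries with values sampled into train_config."""
    if "hidden_size" in train_config:
        net_config["layer1"]["hidden_size"] = train_config["hidden_size"]
    return net_config
```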

Can we do better than that? What are the options? It would probably require a unified way of building/defining the architecture of a network. Would that be too restrictive?
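
As a thought experiment, such a unified definition could be a single builder that consumes a declarative architecture spec, so hypersearch can mutate architecture fields like any other config variable (purely a sketch; none of these names exist in the toolbox, and the PyTorch dependency is an assumption):

```python
import torch.nn as nn

# Hypothetical: one shared builder that turns a flat, declarative config
# into a network, making the architecture searchable like any other variable.
def build_mlp(net_config: dict) -> nn.Sequential:
    """Stack Linear/ReLU layers according to a list of hidden sizes."""
    sizes = [net_config["input_dim"]] + list(net_config["hidden_sizes"])
    layers = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        layers += [nn.Linear(d_in, d_out), nn.ReLU()]
    layers.append(nn.Linear(sizes[-1], net_config["output_dim"]))
    return nn.Sequential(*layers)

# Usage: a flat dict the hyperopt pipeline could mutate directly, e.g.
# net = build_mlp({"input_dim": 32, "hidden_sizes": [64, 64], "output_dim": 10})
```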

RobertTLange commented 3 years ago

While we are at it: rename all `net_config` occurrences to `model_config`.

RobertTLange commented 3 years ago

Maybe the easiest way to do this is to add a config prefix (`train:l_rate` and `network:num_hidden`) to the variables in the search experiment `.yaml` file. This could be incorporated by modifying `gen_hyperparam_configs` in `BaseHyperOptimisation` to include:

https://github.com/RobertTLange/mle-toolbox/blob/b0ae7605de93d06fe1ed64b2a5473ad0ac1d6315/mle_toolbox/hyperopt/hyperopt_base.py#L168-L181

```python
# TODO: Differentiate between network and train config variables?!
for param_name, param_value in proposals[s_id].items():
    # Expect prefixed names such as "train:l_rate" or "network:num_hidden"
    config_id, param = param_name.split(":")
    if config_id == "train":
        sample_config.train_config[param] = param_value
    elif config_id == "network":
        sample_config.network_config[param] = param_value
    else:
        raise ValueError(f"Unknown config prefix: {config_id}.")
```
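
For illustration, the search variables in the experiment `.yaml` could then carry these prefixes. The `train:`/`network:` prefixes come from the proposal above; the surrounding schema keys are only assumptions for this sketch:

```yaml
# Hypothetical search space with config prefixes: entries under "train:"
# are routed to train_config, those under "network:" to network_config.
search_params:
  real:
    "train:l_rate":
      begin: 0.0001
      end: 0.01
      prior: "log-uniform"
  categorical:
    "network:num_hidden":
      - 64
      - 128
      - 256
```
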
RobertTLange commented 3 years ago

Will be addressed in the next PBT PR: 74c7616.