automl / Auto-PyTorch

Automatic architecture search and hyperparameter optimization for PyTorch
Apache License 2.0

Set batch size? #10

Closed ricpruss closed 4 years ago

ricpruss commented 5 years ago

In config/configspace there are files with a more readable parameter configspace txt format than the ones in core/presets, and it reads as if you can set the batch size using it. How do you use that format, or the one expected by config_preset=, when creating AutoNet classes? e.g. config/configspace/tiny_cs.txt:

CreateDataLoader batch_size [125]
InitializationSelector initializer:initialize_bias ["No"]

Thanks, Ric

urbanmatthias commented 5 years ago

Hi,

let me quickly explain how you configure Auto-PyTorch.

(1) Most importantly, you can configure it in the constructor or the fit() method by passing keyword arguments:

autoPyTorch = AutoNetClassification(log_level='info', max_runtime=300, min_budget=30, max_budget=90)

Some of these configs affect the search space by enabling or disabling components:

AutoNetClassification(networks=["resnet", "shapedresnet", "mlpnet", "shapedmlpnet"])

This does not give you fine-grained control, though.
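For example, a minimal end-to-end sketch along these lines (X_train and y_train are placeholder arrays, not part of the original example):

import numpy as np
from autoPyTorch import AutoNetClassification

# placeholder data, just to make the sketch runnable
X_train = np.random.rand(100, 20)
y_train = np.random.randint(0, 2, size=100)

# keyword arguments configure the search, e.g. runtime and BOHB budgets;
# 'networks' restricts which network components are searched over
autoPyTorch = AutoNetClassification(log_level='info',
                                    max_runtime=300,
                                    min_budget=30,
                                    max_budget=90,
                                    networks=["resnet", "shapedresnet", "mlpnet", "shapedmlpnet"])
autoPyTorch.fit(X_train, y_train)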

(2) You can configure all ranges by passing a HyperparameterSearchSpaceUpdates object:

from autoPyTorch import AutoNetClassification, HyperparameterSearchSpaceUpdates

search_space_updates = HyperparameterSearchSpaceUpdates()

# restrict a categorical hyperparameter to a subset of its values
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:activation",
                            value_range=["relu", "sigmoid"])
# set the range [2, 5] for a numerical hyperparameter (log=False: linear scale)
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:blocks_per_group",
                            value_range=[2, 5],
                            log=False)
autoPyTorch = AutoNetClassification(hyperparameter_search_space_updates=search_space_updates)

You can make a hyperparameter constant by passing a value_range with only one item. Note that BOHB currently does not filter out constant hyperparameters, so the model will not be built any earlier. We are working on this.
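In your case, that means you can pin the batch size by appending one more update (continuing the example above; the node and hyperparameter names are the ones used in tiny_cs):

# fix batch_size to a single constant value
search_space_updates.append(node_name="CreateDataLoader",
                            hyperparameter="batch_size",
                            value_range=[125])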

Presets give default settings for (1), which you can overwrite in fit() or the constructor. tiny_cs already passes a HyperparameterSearchSpaceUpdates object to Auto-PyTorch. If you want to use tiny_cs with only batch_size modified, you need to pass a HyperparameterSearchSpaceUpdates object containing (i) your modification of batch_size and (ii) the remaining updates from tiny_cs's search space updates:

CreateDataLoader batch_size [125]
InitializationSelector initializer:initialize_bias ["No"]
LearningrateSchedulerSelector cosine_annealing:T_max [10]
LearningrateSchedulerSelector cosine_annealing:T_mult [2]
NetworkSelector shapedresnet:activation ["relu"]
NetworkSelector shapedresnet:max_shake_drop_probability [0.0,0.000001]
NetworkSelector shapedresnet:resnet_shape ["brick"]
NetworkSelector shapedresnet:use_dropout [False]
NetworkSelector shapedresnet:use_shake_drop [False]
NetworkSelector shapedresnet:use_shake_shake [False]
PreprocessorSelector truncated_svd:target_dim [100]
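Put together, a sketch of tiny_cs with a modified batch size could look like this (assuming the node and hyperparameter names above are still valid in your version):

from autoPyTorch import AutoNetClassification, HyperparameterSearchSpaceUpdates

search_space_updates = HyperparameterSearchSpaceUpdates()

# (i) your modification of the batch size
search_space_updates.append(node_name="CreateDataLoader",
                            hyperparameter="batch_size",
                            value_range=[64])  # any value(s) you want to try

# (ii) the remaining updates from tiny_cs
search_space_updates.append(node_name="InitializationSelector",
                            hyperparameter="initializer:initialize_bias",
                            value_range=["No"])
search_space_updates.append(node_name="LearningrateSchedulerSelector",
                            hyperparameter="cosine_annealing:T_max",
                            value_range=[10])
search_space_updates.append(node_name="LearningrateSchedulerSelector",
                            hyperparameter="cosine_annealing:T_mult",
                            value_range=[2])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:activation",
                            value_range=["relu"])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:max_shake_drop_probability",
                            value_range=[0.0, 0.000001])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:resnet_shape",
                            value_range=["brick"])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:use_dropout",
                            value_range=[False])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:use_shake_drop",
                            value_range=[False])
search_space_updates.append(node_name="NetworkSelector",
                            hyperparameter="shapedresnet:use_shake_shake",
                            value_range=[False])
search_space_updates.append(node_name="PreprocessorSelector",
                            hyperparameter="truncated_svd:target_dim",
                            value_range=[100])

autoPyTorch = AutoNetClassification(hyperparameter_search_space_updates=search_space_updates)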

If you want to configure Auto-PyTorch using config files, refer to autoPyTorch.utils.config.ConfigFileParser.read() for (1) and autoPyTorch.utils.hyperparameter_search_space_update.parse_hyperparameter_search_space_updates() for (2).

For (2), each line corresponds to an update of a hyperparameter:

NodeName hyperparameter_name value_range

For a log-range of the hyperparameter:

NodeName hyperparameter_name value_range log
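As a rough sketch, assuming parse_hyperparameter_search_space_updates() takes the path to such a file (please double-check the exact signature in the source):

from autoPyTorch import AutoNetClassification
from autoPyTorch.utils.hyperparameter_search_space_update import parse_hyperparameter_search_space_updates

# e.g. a file containing the tiny_cs lines above, with batch_size changed
search_space_updates = parse_hyperparameter_search_space_updates("my_search_space_updates.txt")
autoPyTorch = AutoNetClassification(hyperparameter_search_space_updates=search_space_updates)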

Hope this helps,

Matthias