Open timruhkopf opened 2 years ago
Thanks for your advice! Indeed, we are also preparing for constructing a unified search space for both neural architecture search and hyper parameter optimization. We'll consider making the interface more concise (e.g., using ConfigSpace) in the future release!
Really nice! I look forward to it.
On that note, for working with the current version, I have a related question: The Search Space tutorial is rather short, can you please elaborate your numerical list parameter, because I am not entirely sure on its purpose and usage.
Thanks for your advice!
A numerical list is a series of variables in a numerical search space, e.g. when you want to search for the dimensions of multiple layers. You can either assign a fixed length to the list, in which case you need not provide cutPara and cutFunc, or you can let HPO cut the list to a length that depends on other parameters (e.g. when you also want to search the number of layers at the same time as the dimensions). In the latter case, provide those parameters' names in cutPara and the function that computes the cut length in cutFunc. You can assign minValue and maxValue as either a list or a single number. Here is an example:
    {
        "parameterName": "layers",
        "type": "INTEGER",
        "minValue": 2,
        "maxValue": 4,
        "scalingType": "LINEAR"
    },
    {
        "parameterName": "dimension",
        "type": "NUMERICAL_LIST",
        "numericalType": "INTEGER",
        "length": 4,
        "cutPara": "layers",
        "cutFunc": lambda x: x,
        "minValue": 16,
        "maxValue": 128,
        "scalingType": "LOG"
    }
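To make the cutting behaviour concrete, here is a minimal Python sketch (my own illustration, not AutoGL's actual implementation) of how a sampled list might be trimmed using the values of the cutPara parameters and cutFunc:

```python
# Illustrative sketch only: how a NUMERICAL_LIST of full length 4 could be
# cut once the dependent parameter ("layers") has been sampled.
def cut_list(values, cut_para_values, cut_func):
    """Trim a sampled numerical list to the length returned by cut_func."""
    length = cut_func(*cut_para_values)
    return values[:length]

# Suppose HPO sampled layers = 3 and a dimension list of full length 4:
sampled_layers = 3
sampled_dims = [32, 64, 64, 128]

# With cutFunc = lambda x: x, the list is cut to exactly `layers` entries:
dims = cut_list(sampled_dims, [sampled_layers], lambda x: x)
print(dims)  # [32, 64, 64]
```

So with the example space above, whenever "layers" is sampled below 4, only the first cutFunc(layers) entries of "dimension" are used.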
Thinking a little further and investigating your code: going in the direction of ConfigSpace would also allow you to specify prior beliefs about the distribution of parameters, if you were to go Bayesian. This could alleviate having to specify the parameter distributions in autogl.module.hpo.suggestion.
I do see, however, that it is convenient for you to adopt the Advisor syntax for hyperparameter spaces, as you provide a compatibility bridge and partially rely on their models.
Whichever way you go, an awesome package you've got there :)
Is your feature request related to a problem? Please describe. When writing models against the interface, specifying the hyperparameter space is tedious.
Describe the solution you'd like To validate that a model works appropriately under the search space, it would be nice to be able to sanity-check the hyperparameter space.
Describe alternatives you've considered A concise way of declaring complex hyperparameter spaces is the ConfigSpace package.
It also allows sampling from the space and declaring distributions explicitly.
Additional context