I was testing the hyperparameter optimization function. Ray Tune + HyperOpt works fine. Ray Tune + Optuna, however, doesn't seem to handle a nested distribution search space such as the one below:
"dropout": tune.choice([tune.choice([0.0]), tune.quniform(lower=0.05, upper=0.4, q=0.05)]),
The best config is returned as
{'train_loop_config': {'ffn_hidden_size': 700.0, 'hidden_size': 1700.0, 'ffn_num_layers': 3, 'depth': 2, 'dropout': <ray.tune.search.sample.Categorical object at 0x7fc8932f40d0>}}
while something like this is expected
{'train_loop_config': {'ffn_hidden_size': 700.0, 'hidden_size': 1700.0, 'ffn_num_layers': 3, 'depth': 2, 'dropout': 0}}
I'm inclined to comment out the relevant code for now, given the v2.0 release deadline, and only support the random search and HyperOpt search algorithms. We can keep track of this issue for a future release.
As discussed in last week's meeting, we might be able to bypass the issue by repeating 0.0 multiple times alongside the other possible values in a single, flat tune.choice.
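A minimal sketch of that flattening idea (the repeat count and values here are my assumption, not settled in the meeting): tune.quniform(lower=0.05, upper=0.4, q=0.05) covers 8 grid points, so repeating 0.0 eight times would roughly preserve the 50/50 split the nested space gave between "no dropout" and "some dropout".

```python
# Hypothetical flattened dropout space to avoid nesting a tune.choice
# inside another tune.choice (which Optuna can't handle):
# quniform(0.05, 0.4, q=0.05) -> {0.05, 0.10, ..., 0.40}, i.e. 8 values,
# so eight copies of 0.0 keep the outer choice's ~50/50 weighting.
dropout_values = [0.0] * 8 + [round(0.05 * i, 2) for i in range(1, 9)]

# This list would then be passed to Ray Tune as a single categorical:
#   "dropout": tune.choice(dropout_values)
```

This trades the exact nested distribution for a discrete approximation, but it keeps the search space in a form all three search algorithms can consume.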
See: https://github.com/chemprop/chemprop/pull/734