Open · daviddavo opened this issue 4 months ago
In the meantime, setting the seed as an hparam works:

import numpy as np
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

def trainable(config):
    print("config:", config)
    return {'loss': 0.01}

def _another_param(config):
    # Seeding the RNG from the trial's own 'seed' hparam makes the
    # derived value deterministic for a given configuration
    g = np.random.default_rng(seed=config['seed'])
    return config['batch_size']**2 * g.uniform()

search_alg = HyperOptSearch(
    points_to_evaluate=[{
        'batch_size': 3,
        'another_param': 6.57,
        # the 'seed' dimension is left for the searcher to choose
    }],
    random_state_seed=42,
)

tuner = tune.Tuner(
    trainable,
    param_space={
        'batch_size': tune.randint(2, 10),
        'seed': tune.randint(0, 256),  # Or whatever
        'another_param': tune.sample_from(_another_param),
    },
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5,
        metric='loss',
        mode='min',
    ),
)
tuner.fit()
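This works because _another_param draws from an RNG seeded by the trial's own 'seed' hyperparameter, so the derived value is a deterministic function of the suggested configuration rather than a fresh random draw on every trial.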
What happened + What you expected to happen
I'm using a param_space with tune.sample_from(...) to derive a hyperparameter that depends on the batch size. For this, I'm using the HyperOpt search algorithm. I also want to start by evaluating certain points that I know are close to the optimum. However, the sample_from function is executed before the configuration is suggested, and it returns a random value, so the desired point might never be explored.
Versions / Dependencies
Reproduction script
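A minimal sketch of the behavior described above, reusing the illustrative values from the workaround (batch_size=3, another_param=6.57) and mirroring its calling convention for sample_from; the unseeded random draw in _another_param is an assumption standing in for the original derivation:

import numpy as np
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

def trainable(config):
    print("config:", config)
    return {'loss': 0.01}

def _another_param(config):
    # Unseeded: returns a fresh random value every time it is called,
    # so the 6.57 requested in points_to_evaluate cannot be pinned
    return config['batch_size']**2 * np.random.default_rng().uniform()

search_alg = HyperOptSearch(
    points_to_evaluate=[{
        'batch_size': 3,
        'another_param': 6.57,  # this exact value is never evaluated
    }],
    random_state_seed=42,
)

tuner = tune.Tuner(
    trainable,
    param_space={
        'batch_size': tune.randint(2, 10),
        # Resolved before the searcher suggests a configuration,
        # overriding the point requested above
        'another_param': tune.sample_from(_another_param),
    },
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5,
        metric='loss',
        mode='min',
    ),
)
tuner.fit()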
Issue Severity
Medium: It is a significant difficulty but I can work around it.