Hi,
The Tuner currently only supports tuning the model's hyperparameters for the best metric (e.g., accuracy).
batch_size sits outside the model tuning loop and affects training performance rather than accuracy, so the existing tuning logic (which depends on the kerastuner library) can't handle it.
(batch_size belongs more to a TFX pipeline-configuration tuning loop than to the model hyperparameter tuning loop.)
Customizing kerastuner.BaseTuner should work for batch-size tuning: you'd pass an unbatched dataset to the Tuner, and in the run_trial method of the custom tuner you'd batch the dataset with a size drawn from hp.
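A minimal sketch of that idea, assuming KerasTuner 1.x (the kerastuner package). The class name BatchSizeTuner, the candidate batch sizes, and the fit_kwargs keys ('x', 'validation_data') are illustrative assumptions, and the sketch subclasses kerastuner.Tuner (a Keras-specific subclass of BaseTuner) so the existing model-building and model.fit plumbing can be reused:

```python
import kerastuner


class BatchSizeTuner(kerastuner.Tuner):
  """Custom tuner that batches the (unbatched) datasets inside each trial."""

  def run_trial(self, trial, **fit_kwargs):
    hp = trial.hyperparameters
    # Draw the batch size for this trial from the search space.
    batch_size = hp.Choice('batch_size', values=[16, 32, 64, 128])

    # 'x' and 'validation_data' are expected to arrive as *unbatched*
    # tf.data.Datasets, e.g. via TunerFnResult(fit_kwargs={...}).
    fit_kwargs['x'] = fit_kwargs['x'].batch(batch_size)
    if fit_kwargs.get('validation_data') is not None:
      fit_kwargs['validation_data'] = fit_kwargs['validation_data'].batch(batch_size)

    # Let the base Tuner build the model from the trial's hyperparameters
    # and run model.fit() on the now-batched datasets.
    super().run_trial(trial, **fit_kwargs)
```

In the tuner_fn you would then construct the train/eval datasets without batching and return this tuner through TunerFnResult as usual. If the tuner is built with a fixed search space (e.g. hyperparameters=_get_hyperparameters() and allow_new_entries=False, as in the penguin example), 'batch_size' should also be declared in that search space so the oracle is allowed to sample it.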
Thanks for your help! I'll give the kerastuner BaseTuner customization a try.
@jimzer, please close this issue if you are satisfied with the solution provided by @1025KB. Thanks.
Hello,
I'm currently using TFX to build a pipeline on Google AI Platform with the Kubeflow engine. I have a model where the batch size is an important hyperparameter to tune.
I would like to search over this hyperparameter in the Tuner component.
Is that even possible?
I'm following the TFX example with the Penguin dataset, more precisely the Tuner component implementation: found here.
The _get_hyperparameters function returns the search space for the model hyperparameters (see line 139). However, the batch size used to train the model is fixed and specified at the end of tuner_fn (see line 246), roughly as in the sketch below. Is there a way to dynamically change the batch size based on a sample from the hyperparameter space?
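For context, the relevant part of the example's tuner_fn looks roughly like this; it is paraphrased rather than copied, so names such as _input_fn, _TRAIN_BATCH_SIZE and the exact arguments are approximations:

```python
# Paraphrased sketch of the end of tuner_fn in the penguin example (not an exact copy).
def tuner_fn(fn_args):
  # ... tuner is built above with hyperparameters=_get_hyperparameters() ...
  train_dataset = _input_fn(
      fn_args.train_files,
      fn_args.data_accessor,
      transform_graph,
      batch_size=_TRAIN_BATCH_SIZE)  # batch size is a fixed module-level constant

  eval_dataset = _input_fn(
      fn_args.eval_files,
      fn_args.data_accessor,
      transform_graph,
      batch_size=_EVAL_BATCH_SIZE)

  return TunerFnResult(
      tuner=tuner,
      fit_kwargs={
          'x': train_dataset,
          'validation_data': eval_dataset,
          'steps_per_epoch': fn_args.train_steps,
          'validation_steps': fn_args.eval_steps,
      })
```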
Thanks for your help!