Open saranyaprakash2012 opened 3 years ago
You can pass whatever objects to the `tuner.search(...)` function as `x` and `y`, for example, your files. Then, you override `search`, in which you wrap the passed `x` and `y` into generators using the `hp` for `batch_size`, and pass the generators to the `fit` function. You can read the source code of the `search` function here to understand how to do it.
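A minimal sketch of the pattern described above, assuming the Keras Tuner workflow but written framework-free so it stands alone: `HyperParams` and `MyTuner.run_trial` only mirror the names of Keras Tuner's `hp` object and overridable trial method, they are not the real library API. The idea is: read `batch_size` from the hyperparameter object, wrap `x`/`y` into a generator, and hand that generator to `fit`.

```python
# Framework-free sketch (assumed names, not the real Keras Tuner API):
# override the trial method, sample batch_size from hp, and wrap the
# raw x/y arrays into a batch generator before calling fit.

def batch_generator(x, y, batch_size):
    """Yield (x_batch, y_batch) tuples of at most `batch_size` items."""
    for start in range(0, len(x), batch_size):
        yield x[start:start + batch_size], y[start:start + batch_size]

class HyperParams:
    """Stand-in for the `hp` object: records the sampled choices."""
    def __init__(self):
        self.values = {}

    def Choice(self, name, options):
        # A real tuner samples from `options`; this sketch just
        # takes the first one so the behavior is deterministic.
        self.values[name] = options[0]
        return self.values[name]

class MyTuner:
    """Sketch of a tuner whose trial method builds generators per trial."""
    def run_trial(self, hp, x, y):
        batch_size = hp.Choice("batch_size", [32, 64, 128])
        gen = batch_generator(x, y, batch_size)
        # In Keras Tuner you would call `model.fit(gen, ...)` here;
        # this sketch just returns the batch sizes actually produced.
        return [len(xb) for xb, yb in gen]
```

With 100 samples and a sampled `batch_size` of 32, `run_trial` sees batches of 32, 32, 32, and 4, which is exactly what `fit` would consume.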
Thank you for the suggestion. This is definitely the way to go for me to tune the model. I have overridden the `tuner.run_trial` function. Is there a difference in behavior between overriding `search` and `run_trial`, especially when I want to use multiple GPUs?
I don't think there is any difference. They both just end up calling the Keras model's `fit` function.
I would like to use a data generator and tune the epochs and batch size of a BLSTM model. How do I pass the generator to the trial function?
Batch Generator :
Model :
Tuner :
Usual tuner call for epoch training, where the full dataset is passed to the search function:
`tuner.search(X_train, Y_train, validation_split=0.2, verbose=1)`
Tuner call when training other params, where the epochs and batch size are fixed and the data is passed as a generator.
If I add the data generator and `model.fit` within the model function, will the optimizer use the `model.fit` output appropriately?
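As a rough sketch of what tuning `epochs` and `batch_size` inside the trial could look like (all names here are hypothetical stand-ins; the elided Batch Generator, Model, and Tuner classes above are not reconstructed): sample both hyperparameters, build a generator from the full arrays, and loop epochs over it, which is where the real code would call `model.fit`.

```python
import numpy as np

def make_batches(x, y, batch_size):
    """Simple generator standing in for the elided batch-generator class."""
    for i in range(0, len(x), batch_size):
        yield x[i:i + batch_size], y[i:i + batch_size]

def run_trial(hp_values, x, y):
    """hp_values: dict standing in for sampled hyperparameters."""
    epochs = hp_values["epochs"]
    batch_size = hp_values["batch_size"]
    history = []
    for epoch in range(epochs):
        # In real code: model.fit(make_batches(x, y, batch_size), ...)
        n_batches = sum(1 for _ in make_batches(x, y, batch_size))
        history.append(n_batches)
    return history

X_train = np.zeros((100, 5))
Y_train = np.zeros(100)
print(run_trial({"epochs": 2, "batch_size": 32}, X_train, Y_train))  # prints [4, 4]
```

Each epoch regenerates the batches, so the generator must be re-creatable (or a `keras.utils.Sequence`), which is one reason wrapping raw arrays per trial is convenient.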