I need to tune the model's hyperparameters, but I do not want to select the best parameters based on a single train-validation split. How can I use a strategy like scikit-learn's GridSearchCV to find the parameters with the best score, say, as the average over 10-fold cross-validation results? Is there a wrapper for that?
I'm not entirely sure what you're looking for from your description, so feel free to add more details. I imagine you can use TF-DF with the Keras Tuner to write a function that does what you're after. Here is our tutorial for using the Keras Tuner with TF-DF.
Please let us know if you have any exciting results or follow-up questions.
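For reference, the single-split pattern from that tutorial looks roughly like this (a sketch only; the dataset objects train_ds and valid_ds, the parameter ranges, and the trial count are placeholder assumptions, not values from this thread):

import keras_tuner as kt
import tensorflow_decision_forests as tfdf

def build_model(hp):
    # Sample TF-DF hyperparameters from the tuner's search space.
    model = tfdf.keras.GradientBoostedTreesModel(
        num_trees=hp.Int("num_trees", 50, 500, step=50),
        max_depth=hp.Int("max_depth", 3, 8),
    )
    model.compile(metrics=["accuracy"])
    return model

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=10)
# One fixed validation set: train_ds and valid_ds are assumed to be
# tf.data.Dataset objects, e.g. from tfdf.keras.pd_dataframe_to_tf_dataset.
tuner.search(train_ds, validation_data=valid_ds)
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]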
Sorry if I wasn't clear enough. The Keras Tuner example you mentioned uses a single validation set for tuning: as I understand it, for each parameter combination it trains the model on the training data and scores it on the given validation data. But I want to specify how many train-validation splits are evaluated. If I use ten folds, the best parameters should be the ones whose score, averaged over the 10 train-validation splits, is highest. I put a toy example of scikit-learn's GridSearchCV below; I am looking for the equivalent in TensorFlow Decision Forests:
from sklearn.model_selection import GridSearchCV

gs = GridSearchCV(
    estimator=model_to_train,             # any scikit-learn estimator
    param_grid=parameters_for_searching,  # dict of parameter names to candidate values
    scoring='accuracy',
    cv=10,                                # score = average over 10 train-validation splits
)
gs.fit(X_train, y_train)
gs.best_params_
I don't think there is a built-in way of doing this with Keras or the TF-DF tuner. It seems like you can get the Keras Tuner to do it; a quick search turned up projects like this one, though I have not tried it and cannot vouch for its correctness / performance / security / ...
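Absent a built-in option, one workaround is to loop over the parameter grid by hand and average a validation metric across folds, much as GridSearchCV does internally. Below is a minimal, untested sketch of that idea; the CSV path, the "label" column name, and the grid values are all illustrative assumptions:

import itertools

import numpy as np
import pandas as pd
import tensorflow_decision_forests as tfdf
from sklearn.model_selection import KFold

df = pd.read_csv("dataset.csv")  # placeholder: any DataFrame with a "label" column

param_grid = {"num_trees": [100, 300], "max_depth": [4, 6]}
kf = KFold(n_splits=10, shuffle=True, random_state=42)

best_params, best_score = None, float("-inf")
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid, values))
    fold_scores = []
    for train_idx, valid_idx in kf.split(df):
        train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(df.iloc[train_idx], label="label")
        valid_ds = tfdf.keras.pd_dataframe_to_tf_dataset(df.iloc[valid_idx], label="label")
        # Train a fresh model per fold with this parameter combination.
        model = tfdf.keras.GradientBoostedTreesModel(**params, verbose=0)
        model.compile(metrics=["accuracy"])
        model.fit(train_ds, verbose=0)
        _, accuracy = model.evaluate(valid_ds, verbose=0)  # returns [loss, accuracy]
        fold_scores.append(accuracy)
    mean_score = float(np.mean(fold_scores))
    if mean_score > best_score:
        best_params, best_score = params, mean_score

print("Best parameters:", best_params, "mean CV accuracy:", best_score)

Note that this retrains one model per fold per combination (here 4 x 10 = 40 fits), so it gets expensive quickly; TF-DF's own tfdf.tuner.RandomSearch can reduce the number of trials when an exhaustive grid is not required.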