Closed: echavezh closed this issue 4 years ago
Can you post the code you ran as well?
Sure, I basically just copied what was posted in the Reddit post. X and y are simple numpy arrays:
"""
An example training a Keras model, performing
grid search using TuneGridSearchCV.
"""
from keras.layers import Dense, Activation, Dropout
from keras.models import Sequential
from keras.utils import np_utils
from keras.wrappers.scikit_learn import KerasClassifier
from tune_sklearn import TuneGridSearchCV
def create_model(optimizer='adam',dropout=0.1):
model = keras.Sequential()
model.add(keras.layers.Dense(20,activation='relu'))
model.add(keras.layers.Dropout(dropout))
model.add(keras.layers.Dense(1,activation='sigmoid'))
model.compile(loss='binary_crossentropy',optimizer=optimizer,
metrics=['accuracy'])
return model
model = KerasClassifier(build_fn=create_model)
optimizers = ["rmsprop", "adam"]
kernel_initializer = ["glorot_uniform", "normal"]
epochs = [5, 10]
param_grid = dict(
optimizer=optimizers,
nb_epoch=epochs,
kernel_initializer=kernel_initializer)
grid = TuneGridSearchCV(estimator=model, param_grid=param_grid)
grid_result = grid.fit(X, y)
print(grid_result.best_params_)
print(grid_result.cv_results_)
Hi @echavezh, thanks for posting the issue! There are a couple of things that could be causing problems, and fixing these should let it fit properly!

1. The create_model function is missing a parameter for kernel_initializer, which you provided in param_grid, so when it's fitting there is no way to pass in the kernel_initializer parameter you wanted to tune. It looks like you wanted to parameterize dropout, so you can also add dropout to your param_grid if you want to test more hyperparameter configurations.
2. Your script never does import keras (you import the classes directly), so you can just say Sequential(), Dense(20, ...) and so on :)

Hope this helps, and let us know if you have more questions!
For example, you could do:
def create_model(optimizer='adam', dropout=0.1):
    model = Sequential()
    model.add(Dense(20, activation='relu'))
    model.add(Dropout(dropout))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer=optimizer,
                  metrics=['accuracy'])
    return model

model = KerasClassifier(build_fn=create_model)
optimizers = ["rmsprop", "adam"]
dropout = [0.1, 0.2]
epochs = [5, 10]
param_grid = dict(
    optimizer=optimizers,
    nb_epoch=epochs,
    dropout=dropout)
grid = TuneGridSearchCV(estimator=model, param_grid=param_grid)
grid_result = grid.fit(X_train, y_train)
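As a side note on what the search is doing: a grid search evaluates every combination of the values in param_grid, which is why each key must map to a build_fn parameter (or a fit parameter like nb_epoch). A quick standard-library sketch of how such a grid expands, independent of Keras and tune-sklearn (expand_grid is an illustrative helper, not part of either library):

```python
from itertools import product

# Illustrative param_grid mirroring the example above
param_grid = dict(
    optimizer=["rmsprop", "adam"],
    nb_epoch=[5, 10],
    dropout=[0.1, 0.2],
)

def expand_grid(grid):
    """Yield one dict per hyperparameter combination (Cartesian product)."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(expand_grid(param_grid))
print(len(configs))   # 2 * 2 * 2 = 8 configurations
print(configs[0])     # {'optimizer': 'rmsprop', 'nb_epoch': 5, 'dropout': 0.1}
```

Each of these dicts is one model configuration to fit and score, which is also why an unknown key like kernel_initializer fails: there is no matching parameter to receive it.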
Looks like this is fixed now. Going to close the issue for now.
I'm getting an error as soon as I try calling the fit method: