maxpumperla / hyperas

Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
http://maxpumperla.com/hyperas/
MIT License

Hyperas training/validation results are not the same as without hyperas #205

Open bluesheeva opened 5 years ago

bluesheeva commented 5 years ago

I have just started using Hyperas for my Keras optimisation. I created a Conv1D model and trained it on a dataset, reaching 30-40% accuracy within the first few epochs and about 64% accuracy after a few hundred epochs. I then tried exactly the same model, the same parameters, and the same dataset with Hyperas, but could not reproduce those accuracy scores, either in the first epochs or after hundreds of them: the accuracy stays at 18%, and both the training and validation loss are static. I had a colleague test and review my code, and his conclusion is the same.

Also, I run the code in a Jupyter notebook, and I am not sure why I have to restart the kernel for changes to the create_model function to take effect.

Appreciate any help given on this. Here is my code without the data() function:

```python
import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense, Activation
from keras.callbacks import EarlyStopping
from hyperopt import Trials, STATUS_OK, tpe
from hyperas import optim
from hyperas.distributions import choice


def create_model_test(x_train, y_train, x_test, y_test):
    from keras import backend
    backend.clear_session()

    kernel_size = 3
    model = Sequential()
    model.add(Conv1D(128, kernel_size, padding='same', activation='relu',
                     input_shape=(223, 1)))
                     # kernel_regularizer=regularizers.l2(0.01)))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))

    model.add(Conv1D(128, kernel_size, padding='same', activation='relu'))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))

    model.add(Conv1D(64, kernel_size, padding='same', activation='relu'))
                     # kernel_regularizer=regularizers.l2(0.01)))
    model.add(MaxPooling1D(pool_size=1))
    model.add(Dropout(0.2))

    model.add(Conv1D(64, kernel_size, padding='same', activation='relu'))
                     # kernel_regularizer=regularizers.l2(0.01)))
    model.add(MaxPooling1D(pool_size=1))
    model.add(Dropout(0.2))

    model.add(Flatten())
    model.add(Dense(32, activation='relu'))
                    # kernel_regularizer=regularizers.l2(0.01)))
    model.add(Dropout(0.5))

    model.add(Dense(7))
    model.add(Activation('softmax'))

    print(model.summary())

    opt = keras.optimizers.Adam(lr=0.0001, decay=1e-6)

    model.compile(optimizer=opt,  # optimizer={{choice(['adam'])}},
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    callbacks = [EarlyStopping(monitor='val_loss', patience=100)]
    result = model.fit(x_train, y_train,
                       epochs=5000,
                       batch_size=32,
                       callbacks=callbacks,
                       validation_split=0.05)

    # get the highest validation accuracy of the training epochs
    validation_acc = np.amax(result.history['val_acc'])
    print('Best validation acc of epoch:', validation_acc)
    return {'loss': -validation_acc, 'status': STATUS_OK, 'model': model}


if __name__ == "__main__":
    best_run, best_model = optim.minimize(model=create_model_test,
                                          data=data,
                                          algo=tpe.suggest,
                                          max_evals=15,
                                          trials=Trials(),
                                          notebook_name='Emotion_CNN_MANY_FEATURES')

    X_train, Y_train, X_test, Y_test = data()
    print("Evaluation of best performing model:")
    print(best_model.evaluate(X_test, Y_test))
    print("Best performing model chosen hyper-parameters:")
    print(best_run)
```
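For reference, the return dict at the end of create_model_test follows the hyperopt convention that the objective is minimized, so the best validation accuracy is negated before being returned. A minimal sketch of that step, using a hypothetical history dict in place of the real Keras result.history (numpy only, no Keras needed):

```python
import numpy as np

# hypothetical stand-in for result.history from model.fit()
history = {'val_acc': [0.12, 0.15, 0.18, 0.18]}

# hyperopt minimizes 'loss', so negate the best (highest) val accuracy
validation_acc = np.amax(history['val_acc'])
objective = {'loss': -validation_acc, 'status': 'ok'}
```
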