How can I achieve the same result using Keras's implementation? Can I do it with a normalization layer? I could not figure out how, but I tried the following:
from keras.models import Sequential
from keras.layers import Dense
from keras import utils

# Normalize the input; utils.normalize defaults to L2 normalization along
# the last axis, so with X stored as (features, samples) this normalizes
# each feature, and the .T below restores the (samples, features) layout.
X_norm = utils.normalize(X)
model = Sequential()
model.add(Dense(20, input_dim=12, kernel_initializer='normal', activation='relu'))
model.add(Dense(1, kernel_initializer='normal'))
# Compile model (dropped metrics=['accuracy']: accuracy is not meaningful for regression)
model.compile(loss='mean_squared_error', optimizer='adam')
# Fit the model; validation_split must be a fraction in (0, 1), not 10
history = model.fit(X_norm.T, Y, validation_split=0.1, epochs=100, batch_size=15, verbose=0)
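For reference, keras.utils.normalize applies an L2 normalization along the last axis by default. A minimal NumPy sketch (the helper name l2_normalize is mine, not a Keras API) illustrates the per-sample versus per-feature distinction that the transpose above is working around:

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Mirrors the behavior of keras.utils.normalize: scale x so that
    # each slice along `axis` has unit L2 norm.
    norm = np.linalg.norm(x, ord=2, axis=axis, keepdims=True)
    return x / np.maximum(norm, 1e-12)  # avoid division by zero

X = np.array([[3.0, 4.0],
              [6.0, 8.0]])

per_sample = l2_normalize(X, axis=-1)   # each row ends up with norm 1
per_feature = l2_normalize(X, axis=0)   # each column ends up with norm 1
```

Whether you want rows or columns normalized depends on whether your data is stored as (samples, features) or (features, samples), which is why the axis (or a transpose) matters.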
I am very new to Keras. That said, I am trying to avoid scikit-learn's pipeline code, since it is slower than using the Keras network directly.
I am using Keras 2.0.4 with TensorFlow 0.12.1.