keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

NN without Hidden layer #1745

Closed mina-nik closed 7 years ago

mina-nik commented 8 years ago

I want to predict sequences of number vectors based on the previous ones. I have five sequences, and I set the history length to 100. I transformed the data into the following format: as input X I have an array of n matrices, each with 100 rows and 5 columns (technically, X is a tensor with dimensions n x 100 x 5). The target y is an n x 5 matrix: for each input X_i (a 100 x 5 matrix) I want one corresponding row of y (with five elements). So my input data (X) is an n x 100 x 5 numpy array and my output (y) is n x 5.
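
For concreteness, the shapes look roughly like this (n here is just a made-up number of samples):

import numpy as np

n = 1000                      # made-up sample count, only to illustrate the shapes
X = np.zeros((n, 100, 5))     # n x 100 x 5 input: a history of 100 steps with 5 features each
y = np.zeros((n, 5))          # n x 5 targets: one 5-element row per input matrix X_i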

I want to implement a neural network without a hidden layer (essentially a linear regression). Since I implemented my more complicated NNs in Keras, I have to implement this one in Keras as well to compare with my previous results. My code is as follows:

in_out_neurons = 5
history_length = 100
hidden_neurons = 20
model = Sequential()  
model.add(Dense(in_out_neurons, activation = "linear", input_dim = (history_length,in_out_neurons)))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(X_train, y_train, batch_size=100, nb_epoch=2)

I got this error:

Traceback (most recent call last):
  File "/home/mina/Documents/research/five cores (FF) 4/KerasLSTM.py", line 216, in <module>
    main()
  File "/home/mina/Documents/research/five cores (FF) 4/KerasLSTM.py", line 179, in main
    model.add(Dense(in_out_neurons, activation = "linear", input_dim=(history_length,in_out_neurons)))
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 917, in __init__
    super(Dense, self).__init__(**kwargs)
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 43, in __init__
    self.set_input_shape((None,) + tuple(kwargs['input_shape']))
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 141, in set_input_shape
    self.build()
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 922, in build
    self.W = self.init((input_dim, self.output_dim))
  File "/usr/local/lib/python2.7/dist-packages/keras/initializations.py", line 39, in glorot_uniform
    s = np.sqrt(6. / (fan_in + fan_out))
TypeError: can only concatenate tuple (not "int") to tuple

Does anybody know the reason?

tboquet commented 8 years ago

You could use:

model = Sequential()
model.add(TimeDistributedDense(in_out_neurons, activation = "linear", input_shape = (history_length, in_out_neurons)))
model.add(Flatten())
model.add(Dense(5))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(X_train, y_train, batch_size=100, nb_epoch=2)

or flatten your input directly:

history_times_out = history_length*in_out_neurons
model = Sequential()  
model.add(Dense(in_out_neurons, activation = "linear", input_dim = history_times_out))
model.add(Dense(5))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(X_train, y_train, batch_size=100, nb_epoch=2)

(Note that the batch size could be set to something other than 100, since it is not related to your timesteps.) I haven't tested it, but it should be a good hint!
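
For the second option your input also has to be flattened to 2-D before fitting. Something like this, assuming X_train currently has shape (n, 100, 5):

# flatten each (100, 5) history matrix into a single row of 500 features
X_train_flat = X_train.reshape(X_train.shape[0], history_length * in_out_neurons)
model.fit(X_train_flat, y_train, batch_size=100, nb_epoch=2)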

If you are working with sequences, you should take a look at the recurrent layers for more sophisticated models.

mina-nik commented 8 years ago

Thank you @tboquet. I have worked on sequences in #1727 and it works very well. Since the sequence is not important for me here, I forgot to mention that I don't consider history, so history_length is 1. Thus both the input (X) and the output (y) are n x 5. I used your second suggestion in the following way:

 model = Sequential()  
 model.add(Dense(in_out_neurons, activation = "linear", input_dim = (in_out_neurons,)))
 model.add(Dense(5))
 model.compile(loss="mse", optimizer="rmsprop")
 model.fit(X_train, y_train, batch_size=100, nb_epoch=2)

I got the following error; do you know why?

Traceback (most recent call last):
  File "/home/mina/Documents/research/five cores (FF) 4/Kerass.py", line 214, in <module>
    main()
  File "/home/mina/Documents/research/five cores (FF) 4/Kerass.py", line 176, in main
    model.add(Dense(in_out_neurons, activation = "linear", input_dim = (history_times_out,)))
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 917, in __init__
    super(Dense, self).__init__(**kwargs)
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 43, in __init__
    self.set_input_shape((None,) + tuple(kwargs['input_shape']))
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 141, in set_input_shape
    self.build()
  File "/usr/local/lib/python2.7/dist-packages/keras/layers/core.py", line 922, in build
    self.W = self.init((input_dim, self.output_dim))
  File "/usr/local/lib/python2.7/dist-packages/keras/initializations.py", line 39, in glorot_uniform
    s = np.sqrt(6. / (fan_in + fan_out))
TypeError: can only concatenate tuple (not "int") to tuple

nizhib commented 8 years ago

Both input_dim and output_dim must be of type int (doc).

Try model.add(Dense(in_out_neurons, activation = "linear", input_dim = in_out_neurons)).
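
What happens internally is roughly this: the tuple you passed as input_dim ends up as fan_in in the weight initializer, so Keras tries to add a tuple to an int:

fan_in = (5,)       # the tuple that was passed as input_dim
fan_out = 5
fan_in + fan_out    # TypeError: can only concatenate tuple (not "int") to tuple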

mina-nik commented 8 years ago

@enizhibitsky Thank you! You are right.

mina-nik commented 8 years ago

Dear @enizhibitsky, I think that to have a NN without a hidden layer I should delete the second Dense layer. Is that right? Otherwise I think I will have one hidden layer.

tboquet commented 8 years ago

@mininaNik oops, I forgot to adapt the input part. If you mean deleting this layer:

model.add(Dense(5))

You won't get the desired output. If you want only one layer, you should delete the first one instead:

history_times_out = history_length*in_out_neurons
model = Sequential()  
model.add(Dense(5, input_dim=history_times_out))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(X_train, y_train, batch_size=100, nb_epoch=2)

I'm not sure you want to do that, because it's basically a linear regression with 5 outputs and there is a closed-form solution for that problem. It's the same as performing 5 separate linear regressions, one for each output, using your inputs.
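
For example, a minimal sketch of the closed-form solution with numpy, assuming X_train is already flattened to 2-D and y_train has shape (n, 5):

import numpy as np

# append a bias column, then solve the least-squares problem for all 5 outputs at once
X_aug = np.hstack([X_train, np.ones((X_train.shape[0], 1))])
W, _, _, _ = np.linalg.lstsq(X_aug, y_train)   # W has shape (n_features + 1, 5)
y_pred = X_aug.dot(W)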

mina-nik commented 8 years ago

Thank you @tboquet

You are absolutely right, I want to have just a linear regression.

tboquet commented 8 years ago

I should add that using Keras for this task wouldn't be the most efficient solution. If you want to use linear regression, you could also use scikit-learn, or statsmodels if you need to interpret your estimators.
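
For example, a minimal scikit-learn sketch under the same assumptions (2-D X_train of shape (n, 5) and y_train of shape (n, 5)):

from sklearn.linear_model import LinearRegression

reg = LinearRegression()        # fits all 5 outputs at once (multi-output regression)
reg.fit(X_train, y_train)
y_pred = reg.predict(X_train)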

mina-nik commented 8 years ago

Yes, I know; I have also worked with R and MATLAB for regression. But here I want to do the regression in Keras for comparison purposes.