keras-team / keras

Deep Learning for humans
http://keras.io/

How to fix dense and time-distributed error in python? #12550

Closed deerdodo closed 3 years ago

deerdodo commented 5 years ago

I'm setting up a Keras model with Conv2D and LSTM layers and I tried the code below. I also tried not reshaping before the LSTM layer, but that gives me an error that the index is out of range. I have searched a lot, but I couldn't figure out where the problem is or how to fix it. The images that are the input to the CNN model are 128x128.

This is what I have tried:

import numpy as np
from keras.models import Sequential
from keras.layers import (Conv2D, MaxPooling2D, UpSampling2D, Flatten,
                          Reshape, LSTM, TimeDistributed)

num_steps = 50
lats = 128
lons = 128
features = 4
out_feats = 3

model = Sequential()
model.add(TimeDistributed(Conv2D(16, (3, 3), activation='relu', padding='same'),
                          input_shape=(1, 128, 128, 3)))

model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
model.add(TimeDistributed(Conv2D(32, (3, 3), activation='relu', padding='same')))
model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
model.add(TimeDistributed(Conv2D(32, (3, 3), activation='relu', padding='same')))
model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
model.add(TimeDistributed(Flatten()))
model.add(LSTM(units=64, return_sequences=True))
model.add(TimeDistributed(Reshape((8, 8, 1))))
model.add(TimeDistributed(UpSampling2D((2,2))))
model.add(TimeDistributed(Conv2D(32, (3,3), activation='relu', padding='same')))
model.add(TimeDistributed(UpSampling2D((2,2))))
model.add(TimeDistributed(Conv2D(32, (3,3), activation='relu', padding='same')))
model.add(TimeDistributed(UpSampling2D((2,2))))
model.add(TimeDistributed(Conv2D(16, (3,3), activation='relu', padding='same')))
model.add(TimeDistributed(UpSampling2D((2,2))))
model.add(TimeDistributed(Conv2D(out_feats, (3,3), padding='same')))
model.compile(optimizer='adadelta', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()

X_data = np.array(X_data)
X_datatest = np.array(X_datatest)

hist = model.fit(X_data, X_data, epochs=15, batch_size=128, verbose=2,
                 validation_data=(X_datatest, X_datatest))

but it gives me the following error

Traceback (most recent call last):
  File "C:\Users\bdyssm\Desktop\Master\LSTMCNN2.py", line 107, in <module>
    hist=model.fit(X_data, X_data,epochs=15,batch_size=128,verbose = 2,validation_data=(X_datatest, X_datatest))
  File "C:\Users\bdyssm\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\engine\training.py", line 952, in fit
    batch_size=batch_size)
  File "C:\Users\bdyssm\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\engine\training.py", line 751, in _standardize_user_data
    exception_prefix='input')
  File "C:\Users\bdyssm\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\engine\training_utils.py", line 128, in standardize_input_data
    'with shape ' + str(data_shape))
ValueError: Error when checking input: expected time_distributed_1_input to have 5 dimensions, but got array with shape (2892, 128, 128, 3)

This is the model summary

[screenshot of the model summary: Capture1]
tranvohuy commented 5 years ago

It seems that the shape of X_data is (2892, 128, 128, 3), i.e. (number of samples, height, width, number of channels).

So in the first layer you should set input_shape=(None, 128, 128, 3).
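A rough sketch of what that first layer would look like (only the input_shape changes; the rest of the model stays the same, and the data would still need a time axis, see below):

from keras.models import Sequential
from keras.layers import Conv2D, TimeDistributed

model = Sequential()
# None lets the sequence length vary, but the model still expects 5-D input
# of shape (batch, time, 128, 128, 3) when calling fit().
model.add(TimeDistributed(Conv2D(16, (3, 3), activation='relu', padding='same'),
                          input_shape=(None, 128, 128, 3)))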

If you want to keep input_shape=(1, 128, 128, 3) (equivalently, a full input shape of (None, 1, 128, 128, 3), so the 1 is a time axis of length one and actually carries meaning), then you have to change the shape of X_data instead.
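For example, a minimal sketch of that reshape, assuming each image should be treated as a sequence of length 1:

import numpy as np

X_data = np.array(X_data)                         # (2892, 128, 128, 3)
X_data = np.expand_dims(X_data, axis=1)           # (2892, 1, 128, 128, 3)
X_datatest = np.expand_dims(np.array(X_datatest), axis=1)

# Now the arrays have the 5 dimensions the first TimeDistributed layer expects
# with input_shape=(1, 128, 128, 3).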

That is my opinion.