keras-team / keras-applications

Reference implementations of popular deep learning models.

glove embeddings for keras layer #141

Open csharma opened 4 years ago

csharma commented 4 years ago

I'm creating my embedding layer from the glove.6B file as follows:

    embedding_layer = Embedding(
        input_dim=embedding_matrix.shape[0],
        output_dim=embedding_matrix.shape[1],
        input_length=max_len,
        weights=[embedding_matrix],
        trainable=False,
        name='embedding_layer')

    sequence_input = Input(shape=(max_len,), dtype='int32')
    tf.squeeze(tf.cast(sequence_input, tf.string))

    embedded_sequences = embedding_layer(sequence_input)

    return embedded_sequences
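For reference, this is roughly how I build `embedding_matrix` from the glove.6B text file (a minimal sketch; `build_embedding_matrix` and the `word_index` mapping are illustrative names, not part of the snippet above):

```python
import numpy as np

def build_embedding_matrix(glove_path, word_index, embedding_dim=50):
    """Build a (vocab_size + 1, embedding_dim) matrix from a GloVe text file.

    Row i holds the GloVe vector for the word mapped to index i in
    word_index; words missing from the GloVe file stay all-zero.
    """
    # Each line of a GloVe file is: word v1 v2 ... v_dim
    embeddings = {}
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")

    # Index 0 is reserved for padding, hence vocab_size + 1 rows.
    matrix = np.zeros((len(word_index) + 1, embedding_dim))
    for word, i in word_index.items():
        vector = embeddings.get(word)
        if vector is not None:
            matrix[i] = vector
    return matrix
```

The resulting matrix is what gets passed as `weights=[embedding_matrix]` with `trainable=False` so the GloVe vectors stay frozen during training.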

This gets fed to a bidirectional LSTM:

    classification = Bidirectional(
        self.__rnn_layer(size_layer_1, activation, init_mode, False,
                         dropout_layer_1, dropout_layer_1, rnn_type))(embedding)

This gives me:

    Model: "model_1"
    _________________________________________________________________
    Layer (type)                 Output Shape              Param #
    =================================================================
    input_1 (InputLayer)         (None, 200)               0
    _________________________________________________________________
    lambda_1 (Lambda)            (None, 200, 50)           0
    _________________________________________________________________
    bidirectional_1 (Bidirection (None, 32)                6432
    _________________________________________________________________
    dense_1 (Dense)              (None, 1)                 33

I get the following error:

    You must feed a value for placeholder tensor 'lambda_1/input_2' with dtype int32 and shape [?,200]
        [[{{node lambda_1/input_2}}]]
        [[{{node _arg_bidirectional_1/keras_learning_phase_0_3}}]]

Please help.

Best regards, Cartik