keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

I'm afraid that the Bidirectional Wrapper will not work in Keras Functional Api. #20162

Closed dauntless23 closed 2 weeks ago

dauntless23 commented 3 weeks ago
I'm afraid that the Bidirectional Wrapper will not work in Keras Functional Api.

Any help in this sort of thing:

main_input = Input(shape=(100,), dtype='int32', name='main_input')
x = Embedding(output_dim=512, input_dim=10000, input_length=100)(main_input)
lstm = LSTM(32)(x)
bidirectional = Bidirectional()(lstm) #how bidirectional should be instantiated?

Originally posted by @grafael in https://github.com/keras-team/keras/issues/1629#issuecomment-326342067

dauntless23 commented 3 weeks ago

I am failing to use the Bidirectional wrapper in the Keras functional API.

fchollet commented 2 weeks ago

It's easy. It works like this:

main_input = Input(shape=(100,), dtype='int32', name='main_input')
x = Embedding(output_dim=512, input_dim=10000, input_length=100)(main_input)
x = Bidirectional(LSTM(32))(x)
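To expand on the answer above, here is a minimal, self-contained sketch of the same pattern (the model name and shape check are illustrative, not from the thread). The key point is that Bidirectional wraps the LSTM *layer instance*, and the wrapper is then called on the tensor:

```python
import keras
from keras import layers

main_input = layers.Input(shape=(100,), dtype="int32", name="main_input")
x = layers.Embedding(output_dim=512, input_dim=10000)(main_input)
# Bidirectional takes the layer as its argument; the result is called on x.
x = layers.Bidirectional(layers.LSTM(32))(x)
model = keras.Model(main_input, x)
# With the default merge_mode="concat", the forward and backward outputs
# are concatenated, so the feature dimension is 2 * 32 = 64.
print(model.output_shape)  # (None, 64)
```

Note that `Bidirectional()(lstm)` from the original question fails because by that point `lstm` is already an output tensor, not a layer that can be wrapped.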

dauntless23 commented 2 weeks ago

@fchollet your solution seems to be working just fine, but I was wondering if it also works with other layer wrappers, such as TimeDistributed.

mehtamansi29 commented 2 weeks ago

Hi @dauntless23 -

Yes, it also works with the TimeDistributed layer:

from keras.layers import Input, LSTM, Bidirectional, TimeDistributed, Dense
from keras.models import Model

main_input = Input(shape=(5, 1), name='main_input')  # float inputs for the LSTM
lstm = Bidirectional(LSTM(32, return_sequences=True))(main_input)
timedistributed = TimeDistributed(Dense(2))(lstm)
model = Model(inputs=main_input, outputs=timedistributed)
model.summary()
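As a side note, the shape Bidirectional produces depends on its `merge_mode` argument, which is worth knowing when stacking wrappers like this. A small sketch (variable names are illustrative) comparing the default `"concat"` mode with `"sum"`:

```python
import keras
from keras import layers

inp = layers.Input(shape=(5, 1))
concat = layers.Bidirectional(layers.LSTM(32, return_sequences=True))(inp)
summed = layers.Bidirectional(
    layers.LSTM(32, return_sequences=True), merge_mode="sum"
)(inp)
m_concat = keras.Model(inp, concat)
m_sum = keras.Model(inp, summed)
# "concat" (the default) doubles the feature axis; "sum" keeps it at 32.
print(m_concat.output_shape)  # (None, 5, 64)
print(m_sum.output_shape)     # (None, 5, 32)
```

Because `return_sequences=True` preserves the time axis, either variant can feed directly into TimeDistributed.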