philipperemy / keras-tcn

Keras Temporal Convolutional Network.
MIT License

Support for Bidirectional Layer #225

Closed: spacemonqi closed this issue 1 year ago

spacemonqi commented 2 years ago

Could this TCN implementation hypothetically be modified so that it can be wrapped by tf.keras.layers.Bidirectional?

Currently this is not possible, as a layer needs the go_backwards attribute in order to be wrapped by the Bidirectional layer; see https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional

philipperemy commented 2 years ago

@daniel-v-e I am not sure how hard it would be to implement. I forget a bit how Bidirectional works. Does it concatenate the outputs at each step from two RNNs going forward and backward? Does it involve "merging" the states (cells) when doing that (I don't think so, but just asking)? Because TCNs don't have states, unlike GRU or LSTM.

If it's just concatenating outputs at each step, with each TCN independent, then it's definitely doable, and it would not be so hard. I guess we could just do something like the sketch below:
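(A minimal sketch of that idea, not the code that was eventually committed: it assumes two independent TCN layers, feeds the backward branch a time-reversed copy of the input, reverses its output back, and concatenates per step, mirroring Bidirectional's default "concat" merge.)

```python
import tensorflow as tf
from tcn import TCN

def bidirectional_tcn(inputs, **tcn_kwargs):
    """Concatenate the per-step outputs of two independent TCNs,
    one reading the sequence forward and one reading it backward."""
    # Forward branch over the original sequence.
    fwd = TCN(return_sequences=True, **tcn_kwargs)(inputs)
    # Backward branch: reverse the time axis, apply an independent TCN,
    # then reverse again so the steps line up with the forward outputs.
    rev = tf.reverse(inputs, axis=[1])
    bwd = TCN(return_sequences=True, **tcn_kwargs)(rev)
    bwd = tf.reverse(bwd, axis=[1])
    # Merge per time step, like Bidirectional's default 'concat' mode.
    return tf.keras.layers.Concatenate(axis=-1)([fwd, bwd])
```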

philipperemy commented 1 year ago

@daniel-v-e Bidirectional support has been implemented.

You can check it here: https://github.com/philipperemy/keras-tcn/commit/a412190a56dc6ebe67f16d5c63a49ac3d4eb1ba7.

Ref: https://keras.io/examples/nlp/bidirectional_lstm_imdb/

philipperemy commented 1 year ago

Pushed in 3.5.0.
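For anyone landing here, a minimal usage sketch (assuming keras-tcn >= 3.5.0, where TCN exposes the attributes Bidirectional expects, such as go_backwards; the shapes and hyperparameters below are illustrative only):

```python
import numpy as np
import tensorflow as tf
from tcn import TCN

# TCN can now be wrapped directly by tf.keras.layers.Bidirectional.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100, 8)),  # (timesteps, features)
    tf.keras.layers.Bidirectional(TCN(nb_filters=32, return_sequences=False)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.predict(np.random.rand(2, 100, 8))  # sanity check on dummy data
```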