Closed spacemonqi closed 1 year ago
@daniel-v-e I am not sure how hard it would be to implement. I forgot a bit how `Bidirectional` works. Does it concatenate, at each step, the outputs of two RNNs, one going forwards and one going backwards? Does it involve "merging" the states (cells) when doing that? I don't think so, but I'm asking because TCNs don't have states, unlike GRU or LSTM.
If it's just concatenating the outputs at each step, where each TCN is independent, then it's definitely doable, and it would not be hard. I guess we can just add a `go_backwards` argument to the constructor. When `go_backwards=True`, we need to flip the time dimension before the TCN layers are called. `return_state` could just be an empty list?

@daniel-v-e Bidirectional support has been implemented.
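The idea described above, where each direction is an independent TCN and only the per-step outputs are concatenated, can be sketched as follows. This is a minimal NumPy sketch, not the keras-tcn implementation; `causal_conv` is a hypothetical stand-in for a real TCN stack:

```python
import numpy as np

def causal_conv(x, kernel):
    # Hypothetical stand-in for a TCN: a causal 1D convolution where the
    # output at step t depends only on inputs at steps <= t.
    T, F = x.shape
    k = len(kernel)
    padded = np.vstack([np.zeros((k - 1, F)), x])  # left-pad to stay causal
    return np.stack([sum(kernel[i] * padded[t + i] for i in range(k))
                     for t in range(T)])

def bidirectional_tcn(x, kernel):
    # Forward pass over the original sequence.
    fwd = causal_conv(x, kernel)
    # Backward pass: flip the time dimension, run an independent TCN of the
    # same shape, then flip back so outputs align with the input steps.
    bwd = causal_conv(x[::-1], kernel)[::-1]
    # Merge by concatenating the per-step outputs along the feature axis.
    return np.concatenate([fwd, bwd], axis=-1)

x = np.random.randn(10, 4)  # (time, features)
out = bidirectional_tcn(x, [0.5, 0.25, 0.25])
print(out.shape)  # feature dimension doubles: (10, 8)
```

No state is merged anywhere: the two directions never interact until the final concatenation, which is why this works for a stateless layer like a TCN.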
You can check it here: https://github.com/philipperemy/keras-tcn/commit/a412190a56dc6ebe67f16d5c63a49ac3d4eb1ba7.
Pushed in 3.5.0.
Could this TCN implementation hypothetically be modified so that it can be wrapped by `tf.keras.layers.Bidirectional`? Currently it is not possible, as layers need the `go_backwards` attribute in order to be wrapped by the bidirectional layer - see https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional
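For a stateless layer like a TCN, honoring a `go_backwards` flag amounts to flipping the time axis before the forward pass (and flipping the result back so outputs stay aligned with input steps). A minimal, framework-agnostic sketch of that contract, where the class name and the `np.cumsum` body are illustrative placeholders rather than the keras-tcn API:

```python
import numpy as np

class StatelessSeqLayer:
    """Illustrative stateless sequence layer (a stand-in for a TCN).

    Exposes the `go_backwards` attribute that Keras' Bidirectional
    wrapper expects; everything else here is a placeholder.
    """

    def __init__(self, go_backwards=False):
        self.go_backwards = go_backwards

    def call(self, x):
        # x has shape (time, features).
        if self.go_backwards:
            x = x[::-1]            # flip the time dimension on the way in
        y = np.cumsum(x, axis=0)   # placeholder for the real TCN computation
        if self.go_backwards:
            y = y[::-1]            # flip back to align with input steps
        return y

x = np.arange(6, dtype=float).reshape(3, 2)
fwd = StatelessSeqLayer().call(x)
bwd = StatelessSeqLayer(go_backwards=True).call(x)
```

Since there are no states to carry or merge, no `return_state` machinery is needed beyond satisfying the wrapper's interface.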